Running Out of Fuel And AI Autonomous Cars


AI self-driving cars need ways to respond when the vehicle is low on fuel or has run out of fuel. (GETTY IMAGES)

By Lance Eliot, the AI Trends Insider

Running out of fuel. It’s a pain in the neck.

During my college days, a friend of mine had an old jalopy of a car that was busted-up yet still managed to passably work, though the gas gauge seemed to have a mind of its own.

I say this because sometimes the fuel gauge would show that the tank was half full even though it was nearly empty, and at other times the gauge needle was on empty when there was actually half a tank or more of gas in it. It was a wild guessing game as to how much fuel was really in his car at any point in time.

Some affectionately referred to it as the Guess-o-Meter.

He carried in the trunk of his car a gas can filled with about one to two gallons of gasoline, doing so as a precaution in case he got completely tricked by the gauge and ran out of gas while on the road. Everyone told him that carrying around spare gasoline in the trunk of his car was a nutty and highly dangerous practice. It wouldn’t have taken much for that gasoline to ignite, or perhaps if he got into a fender bender the gasoline might cause a small accident to turn into a torrent of fire.

You might assume that he should know how much gasoline he has in the tank of his car.

All he would need to do is keep track of his fill-ups at the gas station and then track his mileage.

By generally knowing how many miles per gallon his car was getting, he presumably could keep tabs on how much gas was likely left in the tank. Yes, this would have been a workaround for the broken gas gauge. Unfortunately, he was not the type of person organized and systematic enough to actually do the kind of mathematical tracking that was suggested (I suppose the fact that he allowed his car to continue to have a suspect gas gauge was a clear sign that he wasn’t the kind of person that did things rigorously!).

We had a running gag that whenever you saw him, you would ask him how much gasoline he had left in his tank. I had thought he would tire of the joke and perhaps it would spur him to fix the gas gauge. Nope, it did not spur him to action. If anything, it seemed like he enjoyed the notoriety of being the guy that had a wild gas gauge. If the gauge didn’t work at all, it would have been much less of a story. The aspect that it would willy-nilly seem to decide how much gas was in the car made it much more entertaining as a story to be told.

I haven’t yet mentioned the times that he did run out of gas.

Most of the time that it happened, we never knew, since he would discreetly make use of his spare gas stored in the trunk and then head to the nearest gas station to fill the tank. I have no idea how many times he did this. I’d bet that it was a regular occurrence and likely he was continually running out of gas and having to use his “fail-safe” operation to get back into business.

Many people don’t realize that allowing your car to run out of gas can be bad in other ways, beyond just getting stranded someplace.

The car doesn’t especially like the notion of you letting the gas entirely be used up. The gas lines can get air in them, which can make it hard to restart the car once you’ve gotten more gas into the tank. Other things can go wrong with the engine by running out of gas. I know that my friend had to often tinker with doing various repairs under-the-hood and I suspect it was probably partially due to how he treated the car by letting it run out of gas.

He was so careless that he sometimes failed to re-fill his spare container of gas.

In other words, he would use the gas can to provide gas once his car ran out, drive to the nearest gas station to fill up the car, and he should then have also re-filled the gas can (I am not advocating the use of the gas can; I am merely saying that if that’s what he was going to use as his back-up, he ought to make sure it was ready to be his back-up). Upon forgetting to refill his spare storage of gas, he then had no back-up for when his car might run out of gas.

I was with him in his car on an occasion wherein this lack of preparation arose. He was driving along Pacific Coast Highway (PCH, as it’s locally called), a scenic road in California that generally parallels the ocean.

We ran out of gas. No problem, he insisted, and got out of the car to grab the spare canister of gas in the trunk.

I waited for him to quickly give us sufficient gas to then reach the next gas station. He got back into the car with a rather sad and sorrowful look on his face. Oops, the spare gas can was empty. One of us, he announced, would need to walk with the gas can to a gas station and bring back enough gas to get us underway again.

One of us?

Wasn’t it his responsibility to make sure that his car had sufficient gas?

Believe it or not, he expressed that since we were both in his car, and both enjoying the use of his car, it really was the responsibility of both of us to tend to the car.

Maybe that makes sense to you, but I assure you it made little sense to me at the time. I was perturbed.

In this case, we decided to push the car off the roadway and we luckily found a parking spot that we could have the car sit in without the threat of it being towed. We then both walked together to the nearest gas station. I suppose it is a memorable story now. At the time, I was irked the entire time of the adventure. From that point forward, whenever I got into his car, I would right away ask to check the spare gas can so that we would not get into a similar predicament again (I suppose that’s my accepting the idea that we both had a joint duty or responsibility when both were in the car).

Even knowing that the spare gas can had enough gas for a quick trip to the gas station was not always so comforting.


We were driving through the Grapevine, which is a somewhat mountainous pass that has a long and winding highway that gets quite steep at times, and I suddenly realized that there weren’t any gas stations for miles upon miles.

A road sign said that the next gas station was a great distance away. I pondered whether the spare gas would be sufficient to get us to the gas station, if we perchance ran out of gas.

When I mentioned to my friend that maybe we ought to reconsider our journey, he pointed out that we’d likely be okay, especially if we reduced our gas consumption while in the Grapevine. He proceeded to turn off the air conditioning (AC). I’d like to mention herein that this was a hot summer day with outdoor temperatures around 100 degrees or more. We began to swelter inside the car. He also turned off the radio and anything else that he figured might use up gas.

We were nearly sitting on pins and needles as we traversed the Grapevine. We kept the windows rolled-up because we had assumed that the aerodynamics of the car would be better if the windows were closed rather than opened. Given the heat outdoors and the mounting heat inside the car with its windows all closed, I wondered which would happen first, we would die of heat stroke or we would run out of gas.

He also offered a “clever” driving ploy (according to him). The Grapevine has portions that go up, and portions that go down. You gradually gain altitude to make it through the mountains. You then lose the altitude as you get past the crest and make your way back down toward ground level. My friend pointed out that on the down slopes he would take his foot off the gas and we could coast. This would “for sure” reduce the amount of gas that we were using.

He also moved the car into the slow lane and tried to go around 35 to 45 miles per hour.

He claimed that according to various gas mileage charts, the sweet spot for using the least amount of gas was around that speed. He said that when you drive at 55 miles per hour or faster, you are disproportionately using up gasoline. As you can guess, he had gradually learned all the ways to try and stretch out your gasoline use, though some of his ideas were borne of myth more than facts.

My children know that to this day I am a bit of a stickler about keeping gas in our cars.

When the needle shows a quarter tank of gas left, I’m on the hunt to find a gas station. When they were learning to drive, I tried to instill this same approach into their driving style. They thought I was crazy. To them, it was fine to let the needle get down to the level at which the car tells you it is low. Most cars will provide a chime, or maybe a visual display or alert, and even let you know how many miles you can still go before running out of fuel.

I suppose it might have to do with the era of having a friend that had a Guess-o-Meter.

I grew up becoming suspicious about the gas gauge. No sense in letting things get too close to the end, became my motto.

Trust the car dashboard to take me up to the edge of the tank?

No, thanks. I’ll be attentive to my fuel needs and make sure to not reach the precipice, thanks.


According to statistics reported by the American Automobile Association (AAA), it responds to about 16 million or more calls per year from drivers that have run out of fuel.

Of course, that number is only accounting for those that have the AAA service.

How many people per year really get stranded by running out of fuel?

I’d bet it is an even bigger number.

When I refer to running out of gas, I am also alluding to running out of electrical charge.

In other words, running out of fuel encompasses both gasoline powered cars (often referred to as ICE, for Internal Combustion Engine), and also electrically charged cars.

Colleagues that have Electric Vehicles (EVs) sometimes say they are low on gas, even though they know and I know that it is not gasoline but instead an electrical charge.

One of the drawbacks right now with EVs is the difficulty of finding a place to charge your EV.

Given the somewhat narrow range of miles that today’s EVs can go on a single charge, you need to be mindful of where the charging stations are. Unlike gas stations, which seem to sit on every corner, you aren’t as likely to be able to find a place to charge your car.

This is gradually changing as more charger locations are established, but for now, it can be a bit dicey to be using an EV for any kind of long-range driving.

Some of the tow services are now carrying with them a fast-charger to aid EVs that have run out of fuel.

These nifty and life-rescuing fast-chargers can potentially give your EV an added 10 miles of range by charging your car in about 10 minutes. I mention this aspect because another trade-off for some about EVs versus gasoline powered cars is that with gasoline you can fill your tank in just a few moments and be back on the road, while with most EVs you need to sit around for a while to let the battery get charged up.

Some EVs have a special slow-speed mode that allows you to stretch out the electrical power and thus make it to a charger someplace nearby. This feature is sometimes referred to as the turtle-mode or the crawl-to-a-charger mode. I suppose you can liken this to the tricks that my friend and I tried to play while on the Grapevine, including turning off the AC, turning off the radio, and keeping the windows rolled up. Those same kinds of tricks can be used for an EV.

With some EVs, depending upon the model, you can potentially use the capability of regeneration to gain some added electrical charge. This generally takes the kinetic energy otherwise lost during braking and returns it, to some degree, back into the battery pack. There are research efforts underway to have specialized tires that turn the roadway friction into electrical power for your EV. There are some that are also trying to put solar panels on the exterior of EVs, allowing for the catching of the sun’s rays to charge the car. In one sense, you might say these are techniques akin to my friend’s use of a spare gas can, though obviously much safer and more sensible.

For more about EVs and electrical power aspects, see my article:


One of the extremely dangerous aspects about running out of fuel involves a car that is in-motion, which presumably is mainly when you would discover you are out of fuel.

Without fuel to run the engine, your car now becomes a danger to those in it and those nearby. You might be able to coast for a little while, but your overall ability to maneuver is now dramatically curtailed.

The other day there was a driver on the freeway that appeared to run out of fuel.

You could see his frantic look as he tried desperately to get his car over to the slow lane and then into the emergency lane. Freeway traffic had been moving along at a fast clip and so other cars were zooming past him, not wanting to be delayed or disrupted by whatever crazy thing he was trying to do. These other drivers couldn’t have cared less about the driver that was frantically clawing his way to the side of the road. It was a quite dangerous situation.

One tends to have little sympathy for a driver that runs out of fuel.

Did you not get a warning from your car that you were nearing the end of your fuel?

Did you not take it seriously?

Why didn’t you try sooner to find a place to get fuel, rather than waiting until you were actually completely out of fuel?

Don’t you know how dangerous your coasting car can be?

What kind of a person lets this happen?

Tow truck drivers often get the wildest stories from those that have gotten stranded and are out of fuel.

Some culprits will just admit they ignored the warnings.

Some say they thought the miles-left-to-go reading was enough to make it to a place to fill up.

Some say their baby in the back-seat was crying and so they got distracted from the fuel gauge.

Some claim that the auto makers purposely have the gauge tell you one thing, such as that you have 10 miles left to go, but those auto makers actually know you have 20 miles left to go, and they do this to fool you into sooner getting refueled. I guess these people then figure that since they “know the trick” they can go ahead and ignore the 10 miles and assume it is more like 20 miles. It takes all kinds.

For more about conspiracy theories, see my article:

In any case, we have the grave danger of a car that has run out of fuel and that is in the midst of being on our roadways and has the potential for becoming an unguided missile.

I say unguided and realize you might object and say that the coasting car should be able to be guided by the driver.

Keep in mind that for some cars, trying to “guide” a moving car that has run out of fuel is not so easy. Also, you presumably can no longer use acceleration to get yourself out of a pickle.

From time-to-time, I have gotten into traffic snarls that involved a car that was stranded in the middle of the roadway.

Some of those instances have been cars that ran out of fuel. This brings up another aspect about the car that has run out of fuel, namely, whether the driver will be able to or even want to get out of the stream of traffic. I realize you probably assume that people would for sure try to get over to the side of the road. Apparently, some people are so terrified when they run out of gas that they just come to a stop in the middle of the roadway (or, it could also be that other cars would not let them get over to the side, so the driver figured they would just stay in the middle of the road).

These cars stranded in the middle of a roadway are absolutely a life-or-death danger. Other cars can ram directly into the stranded car. The occupants of the stranded car can get killed or injured, and the same can happen to the occupants of the car that hits the stranded car (plus, any residual crashes that occur as a result of the primary crash).

Being in the middle of the road, the occupants typically cannot get out of the car, for fear of getting hit by nearby moving cars. They instead sit there, a target, waiting for some unobservant driver to plow into them. I’ll point out too that just because you can get your car to the side of the road, if feasible, it still does not mean you are safe per se. There are often cars sitting at the side of the road that get hit by other cars. You are likely somewhat safer to be at the side of the road than otherwise, but it obviously is still a dicey situation.

This also brings up another facet.

We might all have sympathy for someone whose car suffers a mechanical breakdown while driving along the roadway. Depending upon the nature of the mechanical failure, you might have a difficult time trying to get the car to the side of the road. You might not realize the magnitude of the mechanical failure and end up stranded in the middle of the roadway. I dare say we all dread such a situation and can be sympathetic to anyone that finds themselves in such a boat.

But suppose you knew that the person driving the car knew that their car was already having mechanical problems. Thus, they knowingly opted to get onto the roadway and endanger themselves and others. It wasn’t as though a lightning bolt from the sky suddenly struck their car; instead it was something they should have tended to before it got out of control. I think our sympathy would be a lot less.

If that’s the case, what kind of sympathy should we have for the person that runs out of fuel? Some would say none. You should have zero sympathy for such people. They were likely forewarned by their car. They drive a car as a privilege granted by the state. They are supposed to be responsible drivers. This includes ensuring that the car has sufficient fuel so that it won’t run out. No excuses.

Well, whether or not we should have any sympathy is neither here nor there, herein. The reality is that people do have their cars run out of fuel. It happens. It can occur by happenstance or it can occur by lack of attention or care. However it happens, once it happens, there’s a potential danger for all parties, including those in the car and those anywhere near to the car.


What does this have to do with AI self-driving driverless autonomous cars?

At the Cybernetic AI Self-Driving Car Institute, we are developing AI software for self-driving cars. One aspect of interest is what to do about getting low on fuel and how the AI should deal with such a matter.

Allow me to elaborate.

I’d like to first clarify and introduce the notion that there are varying levels of AI self-driving cars. The topmost level is considered Level 5. A Level 5 self-driving car is one that is being driven by the AI and there is no human driver involved. For the design of Level 5 self-driving cars, the automakers are even removing the gas pedal, brake pedal, and steering wheel, since those are contraptions used by human drivers. The Level 5 self-driving car is not being driven by a human and nor is there an expectation that a human driver will be present in the self-driving car. It’s all on the shoulders of the AI to drive the car.

For self-driving cars below Level 4, there must be a human driver present in the car. The human driver is currently considered the responsible party for the acts of the car. The AI and the human driver are co-sharing the driving task. In spite of this co-sharing, the human is supposed to remain fully immersed in the driving task and be ready at all times to perform it. I’ve repeatedly warned about the dangers of this co-sharing arrangement and predicted it will produce many untoward results.

For my overall framework about AI self-driving cars, see my article:

For the levels of self-driving cars, see my article:

For why AI Level 5 self-driving cars are like a moonshot, see my article:

For the dangers of co-sharing the driving task, see my article:

Let’s focus herein on the Level 4 and Level 5 self-driving cars. Many of the comments apply to self-driving cars below Level 4 too, but the fully autonomous AI self-driving car will receive the most attention in this discussion.

Here are the usual steps involved in the AI driving task:

  • Sensor data collection and interpretation
  • Sensor fusion
  • Virtual world model updating
  • AI action planning
  • Car controls command issuance
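The steps above can be sketched as one processing cycle. The following is a minimal, illustrative sketch; every function and class name here is a hypothetical placeholder, not any vendor's actual API:

```python
# Minimal sketch of the five driving-task steps as one processing cycle.
# All names are hypothetical placeholders, not a real self-driving API.

def fuse(readings):
    # Step 2, sensor fusion: here, just average numeric readings as a stand-in
    return sum(readings) / len(readings)

class WorldModel:
    def __init__(self):
        self.state = None
    def update(self, fused):
        # Step 3: virtual world model updating
        self.state = fused

def plan_action(world_model):
    # Step 4, AI action planning: a trivial threshold rule as a placeholder
    return "brake" if world_model.state > 0.5 else "cruise"

def driving_cycle(sensor_readings, world_model):
    fused = fuse(sensor_readings)      # steps 1-2: collect and fuse
    world_model.update(fused)          # step 3
    action = plan_action(world_model)  # step 4
    return action                      # step 5: issue to the car controls

wm = WorldModel()
print(driving_cycle([0.1, 0.2, 0.3], wm))  # → cruise
```

In a real system each step would of course be vastly more involved, and the cycle would repeat many times per second.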

Another key aspect of AI self-driving cars is that they will be driving on our roadways in the midst of human driven cars too. There are some pundits of AI self-driving cars that continually refer to a Utopian world in which there are only AI self-driving cars on the public roads. Currently there are about 250+ million conventional cars in the United States alone, and those cars are not going to magically disappear or become true Level 5 AI self-driving cars overnight.

Indeed, the use of human driven cars will last for many years, likely many decades, and the advent of AI self-driving cars will occur while there are still human driven cars on the roads. This is a crucial point since this means that the AI of self-driving cars needs to be able to contend with not just other AI self-driving cars, but also contend with human driven cars. It is easy to envision a simplistic and rather unrealistic world in which all AI self-driving cars are politely interacting with each other and being civil about roadway interactions. That’s not what is going to be happening for the foreseeable future. AI self-driving cars and human driven cars will need to be able to cope with each other.

For my article about the grand convergence that has led us to this moment in time, see:

See my article about the ethical dilemmas facing AI self-driving cars:

For potential regulations about AI self-driving cars, see my article:

For my predictions about AI self-driving cars for the 2020s, 2030s, and 2040s, see my article:

Returning to the topic of running out of fuel, let’s consider how things will differ or be the same in a world that includes AI self-driving cars of the Level 4 and Level 5 autonomous style.


Presumably, the AI will be able to detect the amount of fuel available in the self-driving car.

I’ll for the moment put to the side the circumstances wherein the fuel detection system is faulty.

This could happen and I just want to point out that it could happen. This is sometimes a surprising notion to those that believe that AI self-driving cars will somehow have super powers and never fail.

What, they ask, you are suggesting that somehow the fuel detection system might not work?

Impossible, they say, this is a flawless AI self-driving car that has no imperfections.

I am not sure what makes these people think that AI self-driving cars will never have any issues. I suppose they watch a lot of science fiction movies in which the systems of the future never falter or breakdown. Or, maybe they have fallen for some kind of marketing drivel that showcases AI self-driving cars as shiny vehicles that exude perfection.

Anyway, get used to the idea that in the real-world there are going to be AI self-driving cars that suffer mechanical breakdowns, might have recalls, and have the same kind of frailties as conventional cars do.

For more about recalls of AI self-driving cars, see my article:

For the robot freezing problem and AI self-driving cars, see my article:

For various issues that can arise in a self-driving car, see my article:

For those that are idealists about AI self-driving cars, see my article:

Hopefully, the AI should be developed such that if the self-driving car runs out of fuel, even if the fuel detection system is claiming there is fuel, the AI will have a fallback operation that can then try to get the self-driving car into a minimal risk condition, such as getting over to the side of the road. I realize that the odds of the fuel detection system being so far off that the AI gets caught unawares would seem generally unlikely, but the fact is that it could happen, and thus there needs to be a contingency in the AI for this possibility.
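One way to frame that fallback: treat an actual engine stall as authoritative regardless of what the fuel gauge reports. A minimal sketch of the idea, with invented mode names rather than any production logic:

```python
# Sketch: an engine stall overrides whatever the fuel gauge claims.
# Mode names and thresholds are invented for illustration only.

def choose_mode(engine_running, gauge_fraction):
    if not engine_running:
        # Even if the gauge says fuel remains, the stall is authoritative:
        # coast toward a minimal risk condition (e.g., the road shoulder).
        return "minimal_risk_maneuver"
    if gauge_fraction < 0.1:
        # Genuinely low on fuel: start seeking a fueling station
        return "seek_fuel"
    return "normal_driving"

print(choose_mode(False, 0.5))  # gauge says half full, engine dead → minimal_risk_maneuver
print(choose_mode(True, 0.05))  # → seek_fuel
```

The key design point is the ordering of the checks: the physical symptom (the stall) is examined before the sensor reading that might be lying.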

We can also ponder whether the AI would even be able to realize that the fuel is exhausted.

Imagine that the fuel detection system is informing the AI that there is a half tank of fuel. Meanwhile, the self-driving car starts to sputter and the engine dies. The AI now has a situation in which the self-driving car’s engine is no longer working, but the AI is getting no particular indication of why. The fuel volume seems to be just fine, and yet the engine has shut off.

There is another possibility too involving some other mechanism that has gone awry and won’t feed fuel to the engine.

Thus, the fuel detection system might be completely correct, and yet the delivery of fuel from the tank or storage bank to the engine has gone awry. The point being that there is more than one means by which the engine might no longer be getting fuel, either due to the lack of fuel or the inability of the fuel to actually reach the engine.

The AI self-driving car should have various sensors and electronic communication going on with the ECU (Engine Control Unit), which then might aid the AI in diagnosing what is going on. This though can be tricky since presumably it has to happen in real-time. Suppose the AI is driving the car on a freeway at 80 miles per hour and all of a sudden, the engine stops running.

There is a need for the AI to try to quickly figure out what has gone amiss. At the same time, no matter what has happened, the AI needs to be taking action about the aspect that the engine is no longer working. You can debate somewhat about how much diagnosis is needed other than knowing that the engine has stopped. On the other hand, if the AI could discern why the engine has stopped, it might open more possibilities of what next action to take.
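One way to reconcile "act now" with "figure out why" is to issue the safety maneuver first and let the diagnosis only refine, never delay, the response. A toy sketch of that ordering; the ECU signal names here are hypothetical, not an actual OBD interface:

```python
# Sketch: the safety response comes first; diagnosis runs afterward and
# only refines the plan. The ECU signal names are hypothetical.

def respond_to_engine_stop(ecu_signals):
    # Act immediately, before any diagnosis completes
    actions = ["begin_coast_to_shoulder"]
    # Diagnosis refines, but never delays, the response:
    if ecu_signals.get("fuel_level", 0.0) <= 0.0:
        actions.append("report_out_of_fuel")
    elif ecu_signals.get("fuel_pressure", 1.0) <= 0.0:
        # The tank may be fine, but fuel is not reaching the engine
        actions.append("report_fuel_delivery_fault")
    else:
        actions.append("report_unknown_engine_fault")
    return actions

# Gauge says half a tank, yet no fuel pressure at the engine:
print(respond_to_engine_stop({"fuel_level": 0.5, "fuel_pressure": 0.0}))
```

In a real stack the diagnosis would run concurrently with the maneuver; the sketch simply makes the priority ordering explicit.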

For more about the cognitive timing aspects, see my article:

For my article about the debugging of AI self-driving cars, see:

For the debate about driving controls, see my article:


In short, we assert that the AI needs to have a capability to contend with a situation involving the AI self-driving car running out of fuel.

Some AI developers would say that this is covered by their general approach of having the AI deal with anything that might go amiss with the AI self-driving car. In essence, they claim that running out of fuel is no different from say the self-driving car having a tie rod that breaks. It’s all part of the contingency driving aspects routine.

This is not really the case.

If the car is otherwise fully functional, other than the lack of fuel, there is a chance to use the momentum of the self-driving car to try to cope with the situation. A severe mechanical failure is unlikely to allow the AI the chance of maneuvering the self-driving car and trying to find a safe way out of the current difficulty. We eschew the notion that all issues are the same and that there is no need to differentiate among the various kinds of issues that can arise during a driving journey and the functional capabilities of the self-driving car.

Some AI developers would say that the fuel running out is an edge problem.

It is not at the core of what they are aiming to get undertaken with an AI self-driving car. An edge problem is one that is at the corner or edge of what you are otherwise trying to solve. Yes, having the AI self-driving car be able to drive on a roadway and do so without hitting anyone is a core mission.

But, not being able to contend with things that can go awry, including the fuel issues, seems like a myopic view and will get true AI self-driving cars into hot water (along with endangering people).

For why AI developers sometimes take an egocentric view, see my article:

For the dangers of groupthink among AI developers, see my article:

For my article about edge problems, see:

For the burnout of AI developers and what then happens, see my article:

Let’s shift gears, so to speak, and now consider the situation of an AI self-driving car that is genuinely getting low on fuel and the AI knows that the self-driving car is indeed getting low on fuel.

What happens then?

Some auto makers and tech firms are skirting the issue right now because they are essentially controlling their AI self-driving cars, and so they force the AI to go ahead and do a fill-up.

In other words, if you are doing roadway trials with your AI self-driving cars, and you are pampering those AI self-driving cars with a dedicated team of engineers and maintenance personnel, the odds are that you aren’t going to allow the AI self-driving car to run out of fuel.

In the wild, once AI self-driving cars are prevalent, and in the hands of consumers, what happens then?

We’ll consider two different circumstances, one involving human occupants inside the AI self-driving car when the fuel question arises, and the other being the situation when there is no human inside the AI self-driving car. Keep in mind that AI self-driving cars will be driving around at times with no human occupants. It could be that the AI self-driving car is trying to get to a location where it is going to pick up humans, or maybe it is being used as a delivery vehicle and so is heading to a destination to make the delivery, and so on.

It is predicted that AI self-driving cars will be used extensively for ride sharing purposes. If you owned an AI self-driving car that was being used for ride sharing, you might have it trolling around as it waits for a potential rider to request a ride. You look out the window of your office and see your AI self-driving car cruising back-and-forth, waiting for someone to request a lift.

Why not park the car?

There might not be any available parking, plus you might want your AI self-driving car to be the first to respond to a request, and if it is cruising around this might be a better chance than if it is stopped and parked.

For my article about ridesharing, see:

For my article about AI self-driving cars working non-stop, see:

For the potential towing of an AI self-driving car, see my article:


If an AI self-driving car has human occupants, and the AI detects that the fuel is getting low, should the AI let the passengers know?

If so, should it ask them if it is OK for the AI to then find a place to get more fuel?

It would be as though you were in an Uber or Lyft ride sharing car of today, and the human driver turned to look at you, sitting in the backseat as a passenger, and let you know that the car is getting low on fuel. This is informative to you. The driver might then ask if it is OK for the driver to stop at a nearby gas station and fill up. I’m betting you would be irked by such a question. You are presumably paying to get expeditiously from point A to point B. Having to stop and have the car get fuel seems like a rather untoward act.

Suppose you refuse the request by the driver.

If the driver knows they cannot get to your destination prior to running out of fuel, it is likely that the human driver would insist that the car must be brought to a fueling station and, whether you like it or not, the driver is going to do so.

Outrageous, you exclaim!

Why didn’t the driver beforehand make sure there was sufficient fuel to make it to the desired destination?

This is the same logic that some AI developers tell me about the question of fuel when I ask them what they are doing about fuel levels in their AI self-driving car software. These AI developers tell me that it won’t ever happen that the AI self-driving car will have passengers and be at a low ebb in terms of fuel. The AI is programmed to always be getting more fuel whenever it is otherwise not engaged with a passenger, plus, the moment that a passenger indicates where they want to go, the AI can ascertain whether there is enough fuel and thus refuse to take on the passenger if the fuel would be insufficient.

There are some holes in this logic.

Suppose that an AI self-driving car has taken on-board a passenger who at first wanted to go to a nearby park. The AI calculated the miles involved, figured out how much fuel there is, and determined that it can make it to the park. During the driving journey, the passenger says they need to pick up a friend who also wants to go to the park, which means a side trip for the AI self-driving car. Imagine that this pushes the potential fuel consumption such that the AI self-driving car would not be able to get to the friend, then to the park, and then to a fueling station.

You’ve now got a circumstance of a passenger in the AI self-driving car and insufficient projected fuel to satisfy the requested driving journey. Thus, this fanciful notion that the AI self-driving car would never have a passenger aboard while lacking enough fuel for a driving journey is, shall we say, weak.

My point is that there are going to be situations in which the AI self-driving car will have passengers and yet the desires of the passengers might exceed the projected available fuel.

The AI should be able to then interact with the passengers and explain the situation. Furthermore, there might need to be an interactive dialogue about what to do. In the case of the side trip to get the friend while on the way to the park, perhaps the AI explains that it needs first to fuel-up if the side trip is to be undertaken, and therefore “negotiates” with the passenger about what to do.
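The side-trip “negotiation” above can be sketched as a small decision function. This is only an illustrative sketch under my own assumptions: the three outcomes, their names, and the mileage parameters are hypothetical, not a real dialogue system.

```python
# Hypothetical mid-trip re-check: when the passenger adds a side trip,
# the AI either proceeds, proposes refueling first, or declines.
# All outcome labels and parameters are illustrative assumptions.

def plan_side_trip(usable_range_miles: float,
                   miles_remaining: float,
                   side_trip_extra_miles: float,
                   miles_to_station: float) -> str:
    """Return the plan the AI could put to the passenger."""
    needed = miles_remaining + side_trip_extra_miles
    if needed <= usable_range_miles:
        return "proceed"        # side trip still fits within range
    if miles_to_station <= usable_range_miles:
        return "refuel-first"   # negotiate: fuel up, then do the side trip
    return "decline"            # cannot even reach a station safely

# The park example: 60 extra miles needed but only 50 of range left,
# and a station is 5 miles away, so the AI proposes refueling first.
print(plan_side_trip(50.0, 40.0, 20.0, 5.0))
```

The interesting design question is what the AI says for each outcome, since, as argued below, silently dictating the result is likely to alienate passengers.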

For conversing with an AI self-driving car to give driving commands, see my article:

For the socio-behavioral aspects of humans instructing AI self-driving cars, see my article:

For humans helping to teach AI self-driving cars via Machine Learning aspects, see my article:

For more about Machine Learning and AI self-driving cars, see my article:

You might say that it is unnecessary to have a dialogue with the passengers and that instead the AI can just tell the passengers what is going to happen, regardless of what they actually want and without any kind of interaction with them. The AI might simply tell the passenger who wants to pick up a friend prior to going to the park that this side trip is not going to happen. Without even particularly saying why, the AI might just emit an indication that the side trip is not being permitted, and that’s that.

I have a feeling that if AI self-driving cars do that kind of dictatorial driving, people are not going to want to get into an AI self-driving car. People would probably be more willing to deal with a human driver than to have a “robot” that tells them what is going to happen and offers no capacity to try to explain or reason about the driving.

On the other side of this coin, presumably we don’t want a human passenger to let the AI self-driving car get into a dicey situation.

Suppose the AI tells the passenger that the side trip will mean that the AI self-driving car would run out of fuel and get stranded. The passenger maybe decides to tell the AI to go ahead and proceed anyway. Is the passenger crazy? We don’t know. Maybe the passenger misunderstood the AI and the situation. Maybe it is an emergency of some kind and the risk is worth it to the passenger? Could be various explanations.

Anyway, should the AI allow a human passenger to override what the AI has determined, namely that the self-driving car is likely going to run out of fuel and become stranded, with the human telling the AI that it is okay to let this happen?

If you are the auto maker or tech firm, you likely would say no, never allow a passenger to override this kind of circumstance. If you are the human passenger and have some kind of rationale for why you want to do this, you are going to perhaps have a different viewpoint on the AI essentially overriding your command to it.

I’ll point out that this is not the only boundary of having the AI and a human at potential odds about a driving task.

There are other kinds of situations in which the AI is going to want to do one thing, and a human passenger might want to do something else. The industry does not yet have any clear-cut means of ascertaining when the AI should proceed versus acquiesce to the wishes of the human. It’s a difficult matter, for sure.

For ethical aspects of AI self-driving cars, see my article:

For the use of ethics review boards, see my article:

For the singularity of AI, see my article:

For my article about the weaknesses of AI common sense reasoning, see:


Let’s next consider the situation when there isn’t a passenger in the AI self-driving car.

This would seem generally to be an easier circumstance to deal with in terms of the fuel situation.

Suppose the AI self-driving car is trying to deliver a package across town and at first calculated it could make it to the destination without having to re-fuel. Turns out that the AI self-driving car got stuck in traffic due to a car crash that had blocked all lanes, and the fuel of the AI self-driving car got excessively used up due to this unpredictable and unforeseen delay.

I think we might agree that the AI self-driving car should as soon as practical go get fueled-up.

This is likely to create an even further delay in delivering the package. Presumably, the AI self-driving car is going to be interacting with some other system or people to let them know about the delay.

Overall, whenever there is not a passenger in the AI self-driving car, I would guess that we would expect the AI to be monitoring the fuel and take care to make sure to go do a re-fuel when needed. There would need to be a sufficient safety margin that whatever kind of unexpected aspect arises, it hopefully can still make it to the refueling.
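For the unattended case, the monitoring described above amounts to a simple threshold with a safety buffer. Here is a minimal sketch; the 15-mile margin and the function name are my own illustrative assumptions, not a published standard.

```python
# Hypothetical refuel trigger for an unoccupied AI self-driving car:
# divert to a station once remaining range, less a buffer for
# unexpected delays (crashes, rerouting), no longer comfortably
# exceeds the distance to the nearest station. Values are illustrative.

def should_refuel_now(range_miles: float,
                      miles_to_station: float,
                      margin_miles: float = 15.0) -> bool:
    return range_miles - margin_miles <= miles_to_station

# With 18 miles of range, a 15-mile buffer, and a station 5 miles
# away, the car should head for fuel now rather than risk it.
print(should_refuel_now(18.0, 5.0))
```

The size of the margin is the whole game: too small and a single traffic jam strands the car, too large and the car wastes time refueling constantly.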

Suppose though that the AI self-driving car is not able to do so, such as the case of being on the freeway and all lanes of traffic are blocked.

In that case, it could be that the AI has no options to do anything other than sit there on the freeway and use up fuel. Sure, it can try to minimize the amount being consumed, but let’s assume that it ultimately does run out of fuel and has no other means to do anything (it is stuck in traffic and no way to get out).

I know some AI developers who claim this kind of scenario has lower odds than a meteor flying down to earth and striking the self-driving car. I’m not sure they are right about that. My view is that betting on something never happening, when it plainly could happen, seems like a lousy bet.

In an era of AI self-driving cars that are prevalent, presumably the AI self-driving car could use V2V (vehicle to vehicle) electronic communications to let other nearby AI self-driving cars know that it is getting low on fuel. This might then get the other AI self-driving cars to help out and open a path for the AI self-driving car to get off the road.
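A low-fuel V2V broadcast might carry a small structured payload. Real V2V stacks (e.g., the DSRC/SAE J2735 message set or C-V2X) define their own formats; the message below is purely an illustrative sketch, and every field name is my assumption.

```python
# Hypothetical V2V "low fuel" alert payload, serialized as JSON for
# illustration only; actual V2V standards use their own encodings.
from dataclasses import dataclass, asdict
import json

@dataclass
class LowFuelAlert:
    vehicle_id: str
    latitude: float
    longitude: float
    estimated_range_miles: float
    requested_action: str   # e.g., "open_lane" or "assist_push"

alert = LowFuelAlert("car-42", 34.05, -118.24, 1.5, "open_lane")
payload = json.dumps(asdict(alert))   # broadcast to nearby vehicles
```

Nearby cars receiving such an alert could then coordinate, for instance by yielding a lane so the low-fuel car can exit the roadway.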

I suppose that if the AI self-driving car did run out of fuel on the roadway, perhaps the other nearby AI self-driving cars might come to its aid. For example, another AI self-driving car might give the stranded one a push to help it get out of traffic. Other AI self-driving cars might block traffic to let this activity take place.

Meanwhile, via perhaps V2I (vehicle to infrastructure), an electronic message goes out to a local tow truck that it should come and get the AI self-driving car.

For my article about swarms of AI self-driving cars, see:

For federated Machine Learning and AI self-driving cars, see my article:

For the Turing test and AI self-driving cars, see my article:


The Kepler space telescope recently ran out of fuel, bringing to an end its nearly 10-year planet-hunting voyage across outer space, having successfully discovered several thousand exoplanets.

Away it now drifts, looping around the sun and taking on an Earth-trailing orbit.

NASA knew that the Kepler would eventually run dry.

Closer here to home, we should be thinking about the situations when an AI self-driving car runs dry, i.e., runs out of fuel. It seems like perhaps a trivial matter in comparison to hunting for new planets, but I assure you that once we have a prevalence of AI self-driving cars, people are going to want to know that those AI self-driving cars can deal with being low on fuel, along with being able to appropriately handle situations of running out of fuel.

I know it seems counter-intuitive that these seemingly “super powered” AI self-driving cars could somehow find themselves stranded, and we would assume that running out of fuel could only happen to distracted or imprudent human drivers, but these kinds of assumptions need to be revisited.

Maybe we’ll see an AI self-driving car that one day is pleading for fuel, hey buddy, the AI asks another nearby self-driving car, can you spare a gallon or two (or, perhaps some megawatts)?

Copyright 2019 Dr. Lance Eliot

This content is originally posted on AI Trends.

[Ed. Note: For readers interested in Dr. Eliot’s ongoing business analyses about the advent of self-driving cars, see his online Forbes column:]


