We've been told all along that self-driving cars are safer than today's human-driven vehicles. By eliminating human error, these autonomous vehicles are expected to drastically reduce the number of auto accidents on public roadways.

What you might be surprised to learn, however, is that some of these cars are being programmed to break the law by speeding. How can that be safe?

As Google's lead software engineer Dmitri Dolgov told Reuters in a recent interview, the company's autonomous car is programmed to stay within the speed limit most of the time. But when the traffic around it is exceeding the speed limit, it can actually be dangerous to maintain the slower (legal) speed.

Under such conditions, Google has programmed its self-driving car to go up to 10 mph above the speed limit to better keep pace with traffic.
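To make that behavior concrete, here's a minimal sketch of what such a rule might look like, assuming only the details reported to Reuters (prefer the posted limit, but match faster traffic up to a hard cap of 10 mph over). The function name, inputs, and structure are purely illustrative and not Google's actual software.

```python
def choose_target_speed(posted_limit_mph: float,
                        surrounding_traffic_mph: float,
                        max_overage_mph: float = 10.0) -> float:
    """Illustrative speed-selection rule (hypothetical, not Google's code).

    Default to the posted limit, but if surrounding traffic is moving
    faster, keep pace with it -- capped at the limit plus a fixed overage.
    """
    if surrounding_traffic_mph <= posted_limit_mph:
        # Traffic is at or below the limit, so simply obey the limit.
        return posted_limit_mph
    # Traffic is speeding: match it, but never exceed the limit by more
    # than the allowed overage (10 mph in the reported case).
    return min(surrounding_traffic_mph, posted_limit_mph + max_overage_mph)
```

Under this rule, on a 65 mph highway with traffic flowing at 72 mph the car would target 72 mph; if traffic were doing 80 mph, it would top out at 75 mph.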

While I can certainly see the logic behind the decision, it also opens up a whole new can of worms for self-driving cars. For example, if you're caught speeding in your autonomous car, who pays the fine? You? The automaker? The software company that programmed the car? Or is the ticket simply waived?

Along the same lines, who is liable in the event of an accident involving a self-driving vehicle? Maybe it's the person in the non-autonomous car? One could argue that it's their human error that caused the accident in the first place.

As you can see, there are still plenty of unanswered questions about autonomous cars that will need to be sorted out before the technology goes mainstream. Let's just hope these legal issues don't end up permanently sidelining the movement as a whole.