Let's face it: 99 percent of us have no clue how encryption works. As a programmer I understand some of its inner workings, but they are too complex for me to implement myself. Security is hard even for experts; it is very hard for me. Now think about the people who are not versed in technology but have to pass laws for it.
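To see just how subtle this gets, here is a minimal Python sketch of one pitfall, using only standard-library primitives (this is message authentication, not a full encryption scheme): naively comparing two authentication tags with `==` can leak timing information to an attacker, which is exactly the kind of trap that makes rolling your own crypto a bad idea.

```python
import hashlib
import hmac
import secrets

# A random 256-bit key and a message we want to authenticate.
key = secrets.token_bytes(32)
message = b"transfer $100 to alice"

# Compute an HMAC-SHA256 tag for the message.
tag = hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time.

    A naive `tag == expected` comparison can exit early on the
    first mismatching byte, leaking timing information; that is
    why the stdlib provides hmac.compare_digest.
    """
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

print(verify(key, message, tag))      # True
print(verify(key, b"tampered", tag))  # False
```

Even here, the "right" answer is to lean on a vetted library function rather than write the comparison yourself, which is the whole point.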
It's common to hear on the web that the people in charge of making laws are not tech-savvy enough to make those decisions. And quite often it is true. Just look at the whole mess Europe is going through with cookie laws. Having to ask users for permission to set cookies does not protect them. No one in their right mind reads the warning anyway. They simply click whatever closes the popup.
The cookie laws we have today probably owe more to the fun name "cookie" than to sound policy. But privacy is not fun. It is necessary, and we need good laws to protect it.
Not everyone can pick a lock, but anyone who really wanted to could head over to YouTube. Despite this, we still have doors with locks guarding every house, because the law still offers us some basic protection. I bet if you pick a judge at random, he will probably not be able to pick a lock. Even if you gave him the tools, he would have a hard time getting the door open. Nevertheless, that same judge can easily determine whether a crime counts as breaking and entering. And for the most part, we will not question his judgment.
You don't need to be an expert in lock picking to pass those laws. But when it comes to security in technology, we expect the judge to be fully versed before he can rule.
I argue that, just like with breaking and entering laws, the judge does not need to be an expert to pass judgment. Technology is complex in the eyes of most. It truly is, but when we look at the fundamental goals we are trying to accomplish with it, it is not so hard to grasp.
Self-driving cars have been all the craze these past few years, and there is a lot of noise whenever laws are passed to make them safer. One example is the proposed law requiring a driver to be present in a self-driving car. Here is how TechCrunch put it:
In what is sure to be seen by some as government interference and general misunderstanding of technology, the California Department of Motor Vehicles has released a proposal that would require drivers to be present in self-driving vehicles in the state.
It is automatically framed as government interference. Let's think about why it is seen that way.
The car world is a very tight business where it is nearly impossible for a new player to join. (Tesla is a fluke.) How often have you seen a car that follows a completely different design? Any car that drives on freeways and highways has four wheels, a driver's side (left or right depending on the country), a steering wheel, rear view mirrors, and the standard driving pedals. If you got into a car and it was missing one of those key elements, you would certainly panic. But let's not forget that these elements and designs are the byproduct of the infrastructure we built yesterday. These are the things we require of a car today on roads we made yesterday.
But how about the car of tomorrow? The car that drives all by itself. Why do we need a steering wheel when human interaction is not required? Why do we need a driver's seat when the real driver is an AI in a box that can be tucked in a compartment? This opens up space for new innovations. It allows for new seating arrangements. It's not surprising that Google was the first to fight the law. Just a year earlier, their design for a self-driving car came with no steering wheel attached.
2014 Google Autonomous vehicle
In retrospect, this 2015 California law does interfere with their work. Does that mean lawmakers are just conservative folks afraid of any change? Do they fight technology simply because they don't understand it? After reading the actual proposal, and doing just a little bit of research, the answer is no. There are very good reasons for this law, and of course the media only talked about the controversial parts.
First, here are the three other proposed points that were not covered in the media:
Manufacturer Safety Certifications and Third-Party Vehicle Demonstration Test: This would require every finding and demonstration of an autonomous vehicle to be validated by a third party before being accepted. Isn't that how it's supposed to be with all science?
Provisional Deployment Permit with Ongoing Reporting Requirements: Reporting real-world data on the performance of the vehicle. This data will be used to evaluate safety and to make sure the cars are doing what they are supposed to. We have to remember that self-driving cars are new; we need to see that they actually work in real-world conditions before trusting them.
Privacy and Cyber-Security Requirements: Privacy laws should protect the operators of the vehicles. All this data will be sent back to mother base, so the manufacturer has to ensure it is collected securely.
In addition to requiring a driver, these rules are pretty well thought out. And they only require common sense to draft. These are problems we should all be concerned about, whether we know how a microchip works or not.
Now let's see why it is required to have a licensed driver in the vehicle while it is operational. Let's start by looking at this statement on Google's self-driving car page:
Where we are
We’ve self-driven more than 1.5 million miles and are currently out on the streets of Mountain View, CA; Austin, TX; Kirkland, WA; and Metro Phoenix, AZ.
These are all places with ideal driving conditions, and sunny weather for the most part. Introduce a self-driving car into less-than-ideal conditions and you get the Volvo incidents.
A rainy day, a foggy day, a snowy day, or any less-than-ideal day, and all kinds of problems arise with the self-driving car. Hopefully one day we will solve them all. But for now, the laws being passed by not-so-tech-savvy people are common sense. Just because they don't know the inner workings of an A.I. doesn't mean they don't understand the standards a current autonomous car has to meet. At least while this technology is new, we will have to transition into it carefully. The media is pretty good at exaggerating the facts, but it doesn't hurt to have a human driver in the front seat in case things go wrong... for now.
As much as we like to praise the advancement of technology, there are some pretty rough edges. Specialized A.I.s like AlphaGo or a chess engine are better than humans in their own fields. But take them outside into the real world, face them with general-purpose tasks, and they are quickly overwhelmed. These systems are not perfect and can sometimes be fooled.
Having a Ted Cruz talk gibberish about tech is dangerous. It is not the same as a judge taking a side in the ruling of a case. There are always political processes trying to enter the field, and lobbyists pushing for laws that benefit their employers. That should obviously be illegal. But saying that someone who doesn't understand a gadget shouldn't pass laws about it is just as dangerous. For all we know, there is not a single human who understands every part of a modern machine.
But knowing the purpose of the machine should be plenty to decide whether it works to society's benefit or to its detriment.
By the way, I only used self-driving cars as an example of tech. You could easily swap them for one of the many misunderstood laws of the internet. I love self-driving cars and can't wait to test one on a beautiful sunny day in California.