Software Governance and Automobiles - Session 2

Is There a Consensus Around Cars and GPLv3? Panel Discussion with Mishi Choudhary, Daniel Patnaik, Mark Shuttleworth, Eben Moglen, Nicholas McGuire, and Jeremiah Foster.

MISHI CHOUDHARY: Okay, thank you. Now is the time to grill all these gentlemen. If Daniel weren’t here, we would all think that we’ve solved the problems: everybody is very enthusiastic about GPLv3 in cars and we have solutions to all the issues–that’s the impression from the paper and from what everyone said. But that which we call a rose by any other name would not smell as sweet here, because what Eben and Mark call innovation, Daniel might call user manipulation. Humans driving cars is already a complicated process, and now we’re moving to autonomous vehicles, where the limiting factor, obviously, is always safety. So, there are already so many complications, and now you want to add GPLv3 and give people exactly what? The freedom to tinker with the car? So, I want to ask you: are you all in agreement that there is no future of cars without free and open source software? And I want you to talk about that agreement, which obviously has a lot of disagreement built in. Daniel?

DANIEL PATNAIK: Yeah, that’s what I wanted to point out during my presentation. I think there is a future of open source software in cars. This is a fact which I can see every day. There is already open source software in the cars, and there is definitely a future. I remember–and I just mentioned this when we were standing together–that some years ago, some people said, “Okay, we want to block that entirely,” and I said, “Hang on a minute, we cannot and we should not do that, and this is not the way forward.” So I was always encouraging people to take a very precise look at what we are talking about, so that we can show the boundaries and enable software innovation to get into the cars within the boundaries that are important to us.

CHOUDHARY: Mark, what is trusting software? It’s not just knowing the provenance of software, but it’s also about what you talk about in the paper–about how software governance is managed. Daniel also talked about partitioning, so can you talk a little bit more about, in that context, the future of FOSS in cars, and how you see it?

MARK SHUTTLEWORTH: I guess I’m reminded of that old 1980s Cold War line, “trust but verify,” and I think, with hindsight, that was a pretty savvy view: ultimately trust isn’t a simplistic thing–it’s best if it’s backed by science, it’s best if it’s backed by facts, and possibly if it’s backed by teeth as well. So, why do we trust something? Because we believe we can predict the outcome. And what I observe in the industry is that we’re going through that gradient: from trust as a sort of nebulous thing at a very high level, almost tribal and branded, down to a sharper, pointier, almost more useful definition of what I need to trust, and whether or not I trust it for that purpose, effectively.

CHOUDHARY: Daniel, you want to jump in and also talk about how you think this trust plays out, how the car really works, what you lock-down and what you keep open?

PATNAIK: Yeah, I think here too we have to look at where we need trust. If you look at the overall car, of course not only do we want to have trust that everything works well, but the customer, of course, also wants to have trust that everything works well. At the end of the day, trust is also part of the approval of the car, and I think this is one of the key issues on the regulatory side as well. So, as long as there is trust, there is also, from the regulatory side, a permission for it. However, I think we have to distinguish–and I don’t have a clear answer on that right now–where we really need trust. Maybe we have the freedom to back off a little bit and say that here, in this area of the car, trust is not so necessary, but in other areas trust is very, very important.

EBEN MOGLEN: Yes, the difficulty here is that trust is a different concept when you’re not bending metal, you’re making software. The way that the vehicle OEMs got trust in the physical automobile was by saying, “Please use only General Motors replacement parts, please use only our approved spark-plugs… Please this, please that,” and the idea was that somebody manufactures the trusted thing and then there’s a whole bunch of people out there who manufacture untrusted things. Please use the trusted objects and then your car will perform correctly. That’s not a twenty-first century concept anymore. Now we have the problem that no software is ever perfect; therefore the idea that you manufacture a TiVo-ized car, you put some software in it, you lock the software down, and that software works until the car dies is never going to be correct. It’s never going to work that way. The problem with the idea of TiVo-ization is that it establishes trust at the moment the car leaves the factory, and now you are trusting all the defects in the software for the life of the car, and nobody really believes that. Therefore, we are talking about an environment in which we’re going to have to have software-replaceable parts, and we’re having a discussion about whom we trust to make and replace those parts. This is, in the end, trusting people, not trusting software–trusting software is just a reflection of trust in persons.

What the re-organization of Volkswagen reminds us of this week, yet again, is that the idea that the only trusted persons are the manufacturers is also not going to be correct in the twenty-first century. We talk about this highly regulated market, but we now understand–and the Volkswagen case was extremely useful in this, too–that regulators are not going to find the problems in the software; civil society is going to find the problems in the software. This is 100% guaranteed to be true; mathematically, it’s true. The regulators will never employ enough people. There are not enough taxes in the world to employ enough regulators to check all the software. Civil society is, therefore, going to be responsible for inspecting and discovering failures in software. That means FOSS by design, because otherwise we’re using unsafe, uninspectable building materials. And now the problem is: you have inspected and you have found a problem–then what? You write a letter to the automobile manufacturer and say, “You guys ought to get around to fixing this one of these days”? You write to the National Highway Traffic Safety Administration and say, “I found a bug, would you please recall all these automobiles for me”? None of the existing mechanisms will work with respect to what is going to be the most complex, the most dangerous, and the most widespread bunch of software in civil society. We’re going to have to figure out ways to govern repair, modification, and use that don’t depend on trust in a brand on the side of a spark-plug box.

CHOUDHARY: Daniel says take the principles–what FOSS teaches you–but not necessarily the license itself, because that is where openness comes in, and all regulators would like some throat to choke when there is a problem, so…

MOGLEN: Yes, and it won’t be a legal throat to choke; it isn’t a copyright lawsuit against somebody. It’s a technical set of facilities that operate software in vehicles in such a way that regulators, users, manufacturers, parts manufacturers, and third-party service entities can, together, optimize the mix of software in the vehicle at any given moment, given where it is and what it’s doing. That’s going to turn out to require more sensitive mechanisms than either extreme–the model where things only get fixed at the annual check-up or at fifty thousand or a hundred thousand miles, or the “free-flier” zone where it’s all open and every Johnny and Sally makes whatever changes she wants to her automobile. Neither of those is going to be acceptable.

Somewhere in between there has to be a way of doing that more sensitively, and that has to be not a legal set of rules but a technical set of rules, supported by law, where we can use contracts and copyright law and other legal machinery to keep everybody to it. But without acceptable technical solutions for the very complicated problem of governance as a technical matter, we’re not going to wind up with what we want.

If automobiles are TiVo-ized in the twenty-first century and nobody can change the software in them but the manufacturers, the manufacturers are going to wind up very unhappy… Because they’re going to be responsible for a nightmare of liability problems as software ages and conditions change and they’re the only people who can fix it. One of the reasons I think it’s so important to talk about these liability issues is that, at the moment, the automotive manufacturers think that the best way to avoid liability is to control it all themselves, thus piling up all the liability in their hands, and I think what Mark and Jeremiah and I are all saying in different ways is that that’s not the right long-term solution. It doesn’t optimize innovation. It doesn’t optimize liability protection. It requires you to be vertically integrated servicers of long-lifetime, safety-critical software forever. Are you really sure you want to be in that business? So, my question is, does your client really want to be in that business? Does your client, Audi, really want to be in the business of centralizing in itself all liability for software problems forever and being the only point of repair for TiVo-ized software in cars?

PATNAIK: Good question. I think, of course, to a certain extent we want to and have to control a big part of it as long as we don’t have real solutions, because liability is probably two-fold. The liability you mentioned, I can understand, and I see that point as well. But also, on the product liability side, the government or the courts place obligations on the manufacturers: you have to look at what is being done with your car, with your car’s systems, with your software, and if you see that someone is doing something, you have to control this or, at least, analyze it, and you are also responsible, in a certain way, for ensuring that whatever is being done with your software is safe. So, this is the context we are in. Of course, the question is: do we all have to be–do we all want to be–liable for everything? No, probably not. If there’s a way to divest that, of course we will be open to doing it, but I think we are not yet there. We have to get there.

MOGLEN: So there are two things we could think about with respect to that. The first thing is that the most dangerous thing that a human being can do to modify their car is to make uneven the inflation pressures of their tires, and nobody would ever say that General Motors is going to be responsible because Jimmy decided that he liked fifteen pounds per square inch in the left rear tire and forty-five pounds per square inch on the front passenger tire. The resulting wrapped-around-a-telephone-pole experience is not regarded as the manufacturer’s problem. There was a user modification, and it was deadly. It’s way more dangerous than screwing with the VLC that plays the entertainment video in the backseat for your kids so that the volume control will never go past four, which I predict will be a popular modification, right?

We need to understand that what we’re trying to do, at scale, is figure out which forms of software modification are actually not a problem, once we have given the manufacturer tools that allow it to control quite heavily the stuff that really is a problem while not controlling that which is not. No license can do that; no bunch of legal words can do that. There has to be a technical infrastructure, connected with the way software is put together and distributed, that gives us that.

What Mark is saying about Ubuntu Core, as another edition of the software, is that the people who have been most innovative in distributing FOSS in the last generation, and who’ve changed the software industry around the world by doing it, are now concentrated on that question… Because of IoT, because of the automobile, because of all that complex stuff at the edge, we’re now going to learn to package and distribute software with all of that kind of sensitive control…

I want to find a way of bridging the remaining legal difficulties, whether it’s GPLv3 or your concerns about how LGPLv2.1 works. I want to take all of that legal material and re-shape it just a little bit around the edges so that we can understand how it works compatibly with new packaging structures–to give manufacturers fine-grained enough control that they can relax their concern about user modification.

They live with the fact that they can’t control tire inflation pressures from the moment the car leaves the factory. They know that there’s no TiVo-izing the pressure valves in the tires; they understand that there’s no way that every single thing can be controlled in the interest of safety and in the interest of liability limitation. But there are obvious things that we would like to be able to do, including having a computer in the car which constantly monitors tire pressures and puts a note up on the dashboard if something is wrong, right? That is software that we might allow people to modify, but we might also allow them to modify it only in certain ways, and we would certainly want to have control over provenance. You don’t want me modifying the tire pressure gauges in your car with an over-the-air modification, and this, again, is one of the things that Mark and I are trying to address in the paper–to explain how we can use digital signatures and blockchain publication and other things so that everybody–NHTSA after a crash, third-party manufacturers, police investigators–can all know exactly what software was in the car and who put it there.
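[As a minimal sketch of the kind of signed, append-only provenance record described here, the following Python fragment chains each software-installation entry to the previous one by hash and signs it with the installer’s key, so that an investigator can later verify what was installed and by whom. The record fields, the Ed25519 scheme, and the use of the `cryptography` library are illustrative assumptions, not a description of any system discussed by the panel.]

```python
# Hypothetical sketch: a hash-chained, signed log of software installed in a
# vehicle. Each entry commits to the previous entry's hash and is signed by
# whoever performed the installation (OEM, service shop, or owner).
import hashlib
import json
from datetime import datetime, timezone

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

GENESIS = "0" * 64  # placeholder hash used before the first entry exists


def record_install(log, component, version, artifact_sha256, signer_key):
    """Append one signed entry describing a software installation."""
    body = {
        "component": component,              # e.g. "tire-pressure-monitor"
        "version": version,
        "artifact_sha256": artifact_sha256,  # hash of the installed image
        "installed_at": datetime.now(timezone.utc).isoformat(),
        "prev_hash": log[-1]["entry_hash"] if log else GENESIS,
    }
    payload = json.dumps(body, sort_keys=True).encode()
    public_raw = signer_key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw
    )
    entry = dict(
        body,
        entry_hash=hashlib.sha256(payload).hexdigest(),
        signature=signer_key.sign(payload).hex(),
        signer_public_key=public_raw.hex(),
    )
    log.append(entry)
    return entry


def verify_log(log):
    """Re-check the hash chain and every signature; raises if anything is off."""
    prev_hash = GENESIS
    for entry in log:
        body = {k: entry[k] for k in
                ("component", "version", "artifact_sha256",
                 "installed_at", "prev_hash")}
        payload = json.dumps(body, sort_keys=True).encode()
        assert entry["prev_hash"] == prev_hash, "chain broken"
        assert hashlib.sha256(payload).hexdigest() == entry["entry_hash"]
        Ed25519PublicKey.from_public_bytes(
            bytes.fromhex(entry["signer_public_key"])
        ).verify(bytes.fromhex(entry["signature"]), payload)
        prev_hash = entry["entry_hash"]
    return True


# Example: the OEM records one update, and anyone can later verify the log.
log = []
oem_key = Ed25519PrivateKey.generate()
record_install(log, "tire-pressure-monitor", "2.4.1", "ab" * 32, oem_key)
assert verify_log(log)
```

Publishing the entry hashes somewhere public and append-only (the blockchain publication mentioned above) is what would let regulators, third parties, and investigators agree on the same history after the fact.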

That’s going to be a critical part of trust–and a critical part of law–in the twenty-first century. If somebody made a modification to a Tesla’s auto-pilot software and it wound up in the middle of a median divider on a highway at seventy-five miles an hour, who changed those bits is going to be a very important story.

PATNAIK: I fully agree, Eben. I fully agree with your point of view. We have to find ways to take a differentiated view of certain things, and we have to think about how we can get there. I think I put up some ideas about how we could get there, and I think you, Mark and Eben, have also shown a way. I think we have to be open–it’s a good discussion–and open-minded about how we can get there in order to accomplish all of that.

MOGLEN: Maybe we should see who else has questions… Mr. McGuire?

CHOUDHARY: Sure. Nicholas.

NICHOLAS MCGUIRE: Before you go into user modification–or the problem of user modification–what is the expected modification rate of the OEMs themselves? That is already so extraordinarily high that, with your current model, you can’t even handle it, and that’s why I think the discussion of the traditional model versus the user-modified model is actually the wrong discussion. The discussion we need for the automotive industry is the traditional “I control the software” versus “I have highly dynamic software that I’m going to be updating probably something like every two weeks once I have the complexity of an autonomous vehicle,” and if you can solve that problem, extending it to user modifiability will be significantly easier. But as long as we discuss from these two very far-apart sides, I don’t think we’re going to get close.

PATNAIK: It’s difficult to answer because I didn’t entirely get the question, but of course we have these two-sided views… How I understood you was that we should combine the two things, but… Yes, of course there will be updates, and probably even regular ones–if you look at an everyday device that everyone is using today, there are updates every second day, every day, and this will happen with cars too in the future, as more software gets incorporated into our cars… Still, however, I think we should–and we have to–differentiate a little bit about what the individual user does, coming back to Eben’s example. If the customer modifies his individual car and his tire pressure, of course he can do so, and it’s at his own risk, and I don’t want to prevent him–though I would like to in some cases–but this is his personal decision. But if he then makes this public and gives this solution to other people, then I think we are in a position where we at least have to–not control it, but at least know about it and have a position on how we react to it.

SHUTTLEWORTH: There was a reference earlier, sort of tangentially, to regulation and liability, and what’s interesting for me is that this narrative often plays out as a contest of wills between the private individual and the institution and the institution’s commercial interests, but the really interesting cases are all, typically, regulated. And so, the balance of interests is much more complex than just a private individual and an institution, and, I guess, in a sense all of our interests are represented in the regulatory function–all of our interests are represented in civil behaviors and decisions.

I think it’s really important that we figure out mechanisms to represent those stories. We may well come to the view that actually that is very helpful to manufacturers, because it essentially starts to establish the limits of their control, or the limits of their expectation of control, and, therefore, liability. Anything that is essentially on public roads, or anything that is essentially in a public environment, becomes something like a shared responsibility, and having a clear limitation on what you’re expected to enforce is, potentially, helpful in the bigger picture.

MOGLEN: Mr. McGuire’s point that the amount of modification occurring in the software in vehicles is going to be extremely high, all the time, I think is unanswerably correct, right? Once we are talking about software doing the driving, it is going to be updated all the time, to take account of all kinds of experience that was unexpected, and even before we get to that…

Look, there is an argument that the anti-lock braking software–which we all want to think of as something you do once, and you do right, and you never let anybody change again–really ought to be changed according to weather conditions and all sorts of other factors, and that we really ought to want a high degree of software volatility inside the vehicle. But without strong governance principles, including the ability to roll back halfway, we’re going to wind up in a world where automobiles that can kill people are no more successful at completing their updates than Windows 10 devices, which, after all–pardon me, Justin, it’s hard for me to imagine that I have friends in Microsoft, I need to be careful about what I say about them in public–but let’s face it, even in a comparatively simple environment called “one device we provide the operating system for,” high dynamism in updating is a terrible, terrible problem, and we need better tools for it.
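[A rough sketch, in Python, of the “roll back” idea referred to here: an A/B update scheme writes the new image to the inactive slot, boots it once on probation, and automatically reverts to the old slot if a health check fails. The slot layout, the health-check hook, and the class names are hypothetical, not Ubuntu Core’s or any OEM’s actual mechanism.]

```python
# Hypothetical A/B update manager with automatic rollback after a failed trial boot.
from dataclasses import dataclass, field


@dataclass
class ABUpdater:
    active: str = "A"                                   # slot currently booted
    images: dict = field(default_factory=lambda: {"A": "v1.0", "B": None})
    on_trial: bool = False                              # is the new image on probation?

    def _other(self) -> str:
        return "B" if self.active == "A" else "A"

    def stage(self, image: str) -> None:
        """Write the new image to the inactive slot; the running system is untouched."""
        self.images[self._other()] = image

    def reboot_into_trial(self) -> None:
        """Switch to the freshly staged slot, but only provisionally."""
        self.active, self.on_trial = self._other(), True

    def confirm_or_rollback(self, health_check) -> str:
        """After the trial boot, commit the update or fall back to the old slot."""
        if self.on_trial:
            if health_check():
                self.on_trial = False                               # commit the new image
            else:
                self.active, self.on_trial = self._other(), False   # revert to old slot
        return self.images[self.active]


# Example: a staged update whose health check fails is rolled back transparently.
updater = ABUpdater()
updater.stage("v1.1")
updater.reboot_into_trial()
assert updater.confirm_or_rollback(lambda: False) == "v1.0"   # back on the old image
```

The design point is that the decision to keep or discard an update is made by the vehicle itself, against explicit health criteria, rather than being a one-way write.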

I also think that something Jeremiah said is critically important at this moment, when we talk about regulation. In the world that governments are now looking at, it is the data generated around the car which is the greatest and most important subject. All this other stuff is comparatively traditional.

Now, there are two ways of thinking about that: one is that all the data generated around the vehicle is going to be regulated and government-controlled, and the other way is my way, which is that that had better not happen. And one of the most important things that users of automobiles and other vehicles and autonomous systems in the twenty-first century are going to want the power to modify is the leakage of the data. I’m okay with my car having as much tendency to be chatty about who I am and where I’m going as is minimally necessary in order to achieve certain agreed-upon social goals, and after that…? Right?

I mean, I live a reasonably effective life in the net without a Google account, without a Twitter account, without a Facebook account, without a platform relationship of any kind. Please don’t tell me that in order to own an automobile in the twenty-first century I’m going to have to be more risky with my personal data and the substance of my life than I am already.

And that surely means that there are going to be levels of desire for user control over the way software in vehicles works which are extremely valuable to the individual, extremely important to manufacturers and service platforms, and extremely interesting to government regulators. The rules about all of that have to be adaptable. We have to be able to have that social policy conversation in a serious way, and without some kind of technology for governance of user modifications of the software in cars, we can’t have the conversation at all.

This is why, Nicholas, from my point of view, it’s not only about the question of the dynamism of the software environment. It is, in the end, also about who has rights… Because I think the rights package that was involved in the twentieth-century automobile–which was basically the open road and the freedom of people, which the automobile came to stand for–had better not be inverted in the twenty-first-century meaning of the package, so that the automobile becomes a form of social control for whoever owns it, runs it, services it, manages it, and not for the person we used to quaintly think of as the driver. And from that point of view, it seems to me, how the software is governed and how it is updated, and who has the right to update it, is going to be terribly important to all of us. Of course I’m concerned about my safety. Of course I do not want my brakes to fail when I pump the pedal, but I also don’t want the automobile ratting me out every place I go to people I can’t do anything about.

AUDIENCE MEMBER: If you’re still pumping your pedal, you’re doing it wrong!

MOGLEN: Oh no, on the contrary, I don’t trust anti-lock brakes on an icy road, and that’s an example of software failure that I have experienced in my life from time to time. Of course I pump the brakes. Tough shit if the software thinks I’m not going to.

CHOUDHARY: Other questions?

AUDIENCE MEMBER: Yeah, I’d like to make a couple of comments pertaining, I guess, to the auto industry. Was it Ralph Nader who wrote the book, “Unsafe at Any Speed,” about the Corvair and the garbage that General Motors produced many years ago and, apparently, continues to produce today? I’m thinking about when the cell phones came along and we saw a spike in auto accidents, followed by auto deaths–I believe that last year auto deaths in the U.S. were approximately thirty-four thousand–so I wonder, with the automotive manufacturers loading up distraction device after distraction device on the dashboard, where you can watch a video, tune into the internet, and, generally, get distracted… So the auto deaths… You know, do the manufacturers really give a damn? And I’d say not really. And then, in terms of the auto industry moving to Mexico, South Korea, and China, when you look at the J.D. Power quality study of autos, I think there’s one U.S. automaker in the top ten–fortunately, Audi is in the top ten along with Lexus, Toyota, Honda, Nissan, etc. So, I guess my last point is to Daniel: can you comment on the fact that BMW and Mercedes have recently announced a joint venture–I believe it’s really to counter the power of Google, Tesla, and Apple, where the automobile becomes the software machine on wheels, and the Germans don’t want to be squeezed out of the business by Silicon Valley. So, over to you, Daniel.

PATNAIK: Yes, thank you. That was a bunch of remarks and questions, but maybe to start with the last: I’ve been talking to the big players in Silicon Valley as well, because I’ve been legally involved as well, and I know from our departments that we are, of course, doing the same. We understand that there are interests in coming up with software solutions, and we’re trying to match those interests by coming up with our own solutions–either within the automotive industry with other automotive companies, or within the group, where we have more than thirteen brands–at least thirteen brands–at the moment. We have a big and interesting amount of power with which to be competitive with the others, so we don’t have to fear it, but we do have to watch it very closely.

But you also mentioned at the very beginning that the producers don’t give a damn about distraction in the car, and, on the contrary, we care a lot about it. In my department, at least, we have engineers who care a lot about product liability, and I work with them very closely, and distraction in the car is of high importance, at least in our company. So we don’t allow things like movies, as you said–we don’t allow that. Whenever the car starts moving, everything is shut down. Of course, I cannot prevent a customer from putting his phone somewhere in his field of view, attaching it somewhere and watching that–I cannot prevent that–but we are even looking at it, and we’re trying to see whether there is a cable connected, to try and stop that. This is a very wide field that we could discuss for a long time, but we care a lot about these issues–about the customer not getting distracted by whatever is there.

JEREMIAH FOSTER: I think car companies all care about safety. I think they care a lot more than we imagine, because the message that gets out to society is much more commercial, much more about selling, and I think they realize they need to adjust their message. But they absolutely do care, and, in fact, for car companies like Volvo, that is their differentiation–safety is the differentiation. They invented the three-point seatbelt, for example, and they are going to try to take the approach that Eben was talking about. You know, the CEO of Volvo says that they will stand behind the liability of their autonomous vehicle systems.

Now, that’s really good. We want that. We want, as consumers, a throat to choke when something goes wrong. We want to hold their feet to the fire. But that car is going to be built with FOSS, and they are going to use GPLv3, and we’re going to have to have a way to make sure that’s all done the right way–that’s obviously the topic of this event–but it gets back to the point about safety. Who is responsible? How is that going to be done? And a lot of this autonomous driving, though it presents a dystopia where the car drives you to jail or what have you–the fact is that even small amounts of it are going to save lives, but it has to work in combination with infrastructure as well. I mean, there are things that we can do today that we’re not doing, like making traffic systems–the roads–safer. That’s done in Sweden, which has the lowest death rate per kilometer traveled. I know New York City has talked with Sweden, but, you know, every country needs to do that. That’s a big priority, and it’s not on the car makers to do that work.

AUDIENCE MEMBER: What can we learn from the custom car movement in the 1950s and 1960s? I mean, we used to modify everything, all the time. You know, it was not considered cool to be driving a standard version of a car. So, how can that inform this? Thank you.

PATNAIK: I think what you’re mentioning, or describing, is legitimate, and I think this is something we will also see in the future. The customer has, and should have, the freedom to do exactly this with his own product. If it comes to the point that this is something which can be and should be used in a wider, more open space, then of course there needs to be a check somewhere that it matches the overall security and safety standards.

AUDIENCE MEMBER: I was thinking more of a Dodge Mopar–in other words, there are models for this… In the past…

MOGLEN: So, what did we learn? I think we learned two things. I think the first thing that we learned is that the industry benefited from user innovation substantially. It picked up a lot of tricks from people in the street over the years. It picked them up with respect to design. It picked them up with respect to the forms of fashionable operation, whether low-riding, high-riding, loud mufflers, not-so-loud mufflers…

But the other thing that we should learn from it is that the automobile was an extraordinary technical university for the human race. There are people all over the human race who learned things about technology and who learned how to make a living by working on cars. Cars were a vehicle for the education of people in the technologies that cars contained, and that knowledge flowed out into vernacular technical cultures of all kinds around the world.

This was a lesson we learned in the free software movement, right? I mean, that’s why I said two decades ago that free software is the greatest technical reference library ever assembled on Earth because it’s the only way that a person without skill yet in the art can learn from the very best that is being done in all the ways that are being done just by reading stuff you can get for free.

We want the automobile to continue to be the seed of vernacular technical education in the world. We can’t do that if people can’t modify the car.

This was the point of GPLv3’s anti-lockdown provision in the first place. We were trying to preserve what we understood to be the way people in the world became technically highly capable–namely, by hacking on their own things–and we did not want the number of things in the world locked down so tightly that young people couldn’t learn to program on them to go up too high. That always seemed to us a global north-south issue. In the north, there were lots of people who, if they bought one computer that was locked down, could buy another computer that was not locked down. Linus Torvalds was a good example–it didn’t bother him, he didn’t need to worry about it. He could buy another computer. But all over the world, there are people who have exactly one, and it would be good if they could hack on it, because that’s where they’re going to learn.

This should also be true of the car in the twenty-first century, because we saw how important it was in the twentieth. So, for me, the stakes in the modifiability of the technology include all the human learning that will flow from it, which is a vast welfare loss to the world if the machines are not willing to allow people to learn from them, and that means the ability to read and to understand, but it also means the ability to experiment.

Sure, it’s more dangerous when it’s an object that travels at high speed, and, therefore, it would be really good if we were clever about it–like the GPS in the car telling the software agent, “No more modifications now, he’s on a smart road,” because he has to be totally in sync with all that complex built environment around him–or, “Nah, he’s in the middle of the back of beyond on his own real estate in a rural county, let him do whatever the hell he wants to do.”

And to have the ability to move back and forth between a highly regulated software state on a smart road and a less regulated software state somewhere else–all of that lies within our existing technical capacity. So, if we have the technical capacity to do those things, we should… Because what I think we learned from the twentieth century history of the car was that technical enablement was really valuable to people–it made an enormous difference in peoples’ intellectual and economic development and in their lives.
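[A toy illustration of the kind of context-sensitive policy gate just described: which classes of software accept modification depends on where and how the vehicle is operating. The context categories, component classes, and policy table below are invented for the example, not anything proposed by the panelists.]

```python
# Hypothetical policy gate: whether a software component may be modified right now
# depends on the vehicle's current operating context (e.g. as reported by GPS).
from enum import Enum, auto


class Context(Enum):
    SMART_ROAD = auto()       # coordinated infrastructure: no local changes at all
    PUBLIC_ROAD = auto()
    PRIVATE_LAND = auto()     # "the back of beyond on his own real estate"


class ComponentClass(Enum):
    SAFETY_CRITICAL = auto()  # braking, steering, airbags
    REGULATED = auto()        # emissions, lighting
    CONVENIENCE = auto()      # entertainment, interior settings


# Which component classes accept user modification in which contexts.
POLICY = {
    Context.SMART_ROAD: set(),
    Context.PUBLIC_ROAD: {ComponentClass.CONVENIENCE},
    Context.PRIVATE_LAND: {ComponentClass.CONVENIENCE, ComponentClass.REGULATED},
}


def modification_allowed(context: Context, component: ComponentClass) -> bool:
    """Return True if the current policy permits changing this component here."""
    return component in POLICY[context]


assert not modification_allowed(Context.SMART_ROAD, ComponentClass.CONVENIENCE)
assert modification_allowed(Context.PRIVATE_LAND, ComponentClass.REGULATED)
```

The interesting part is the policy table itself, which is exactly the place where regulators, manufacturers, and users would negotiate what may change where.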

As I’ve traveled around the world in the last lifetime of mine, I have certainly seen an awful lot of things that were done by modifying cars that car industries learned from and, more importantly, that people learned from, and whether it was on a Caribbean island where everybody was using left-drive cars for right-side-of-the-road driving or whether it was the adaptations of the self-worked-out propane conversions in countries in the global south or whether it is the miracle of the auto-rickshaw–we may not want to ride in them but it’s a miracle that they exist and they stay on the road decade after decade with guys doing fixes at the end of the day and putting stuff together with spit and baling-wire–all of that comes from the history you’re talking about, that the automobile was a highly sophisticated, very complex, but also very enabling technology that people interacted with in a whole bunch of ways that we now call hacking, and it worked really, really well.

SHUTTLEWORTH: I’m sorry, you described a brand–was it like a Dodge…?

AUDIENCE MEMBER: Mopar.

SHUTTLEWORTH: Mopar? And was that sort-of a modified…?

AUDIENCE MEMBER: It was called an after-market… It was an entire after-market industry…

SHUTTLEWORTH: Right. So I have to ask because I’m a lot younger than I look, and my memories of the 1950s and 1960s are entirely manufactured from watching movies made before I was born, but my impression is that this was the first time when pretty much every family got access to a car and cars were super cool–it was still just a little exclusive but not really that exclusive, and what you’re describing reminds me so much of the importance of tapping into passion, tapping into enthusiasms, and this is true for every brand. It’s easy to forget once you’ve become successful, you think that it’s you that makes you successful, but it’s not really, right? It’s peoples’ passion for what you mean to them and so on.

So, we see this, in our little way, in the existence of derivatives of Ubuntu, right? People who have different passions to us but it’s easiest for them to express those passions starting with Ubuntu, and we just grant them the rights to do that because it costs us nothing and the reality is it’s interesting what they do–it’s much more interesting what they do, often, than anything I might do in a day, and it generates enthusiasm, it generates activity.

I can see real value for that in, for example, a car or another object manufacturer being able to say, “Look, as long as I can bound these things, I don’t mind allowing the creation of a Mopar,” right? An enthusiast’s sort of derivative, effectively. As long as I can bound the pieces where I may have a regulatory issue, effectively. That is actually a huge asset to me because all of that time, all of that energy, all of that thinking, is effectively much more directly applicable to me than it is to any of my competitors, and so, we may well see–you know, as soon as we have the ability to start drawing these distinctions–that that kind of fan club enthusiasm is both enabled and encouraged.

AUDIENCE MEMBER: Can I just ask which regulator has agency in all of this and what’s being done to ensure that there’s more consistency across regulations in the various sectors?

FOSTER: All of them. Yes and no. I don’t think that there’s a single governing body. In fact, part of the issue is that you have, for example, right-to-repair laws in Massachusetts that don’t necessarily exist in other states. You have CARB, the California Air Resources Board, which basically sets policy for the United States when the federal government is not fighting it for doing so, and then you have jurisdictions across the rest of the world, which may match or may be completely opposed, and then you have governments that want both to create new regulations for new income streams and to preserve a differentiation, or an opportunity, for their own automotive industries to be competitive. So, yeah, that kind of harmonization–I don’t see it existing any time soon.

PATNAIK: I would just add: I don’t see that either. This is all multi-national. Every national authority has its own view of it, and I don’t see anything being done to bring everything together. But I would like to see the eyes of the regulator, in whatever nation we look at, if we tell him, “Okay, we have produced and certified a car, and we have allowed the customer to do whatever he wants with the car, and, by the way, he’s just driving down the road here.” I want to see the eyes of that regulator. So I think there’s a lot to do, and a lot to discuss on the side of the regulators, to make them understand the issue of FOSS being used in cars.

CHOUDHARY: If there’s a regulator in the audience, this is your time. Other questions?

AUDIENCE MEMBER: Yeah, a quick follow-up for Jeremiah. Jeremiah, you mentioned the safety of Volvo; well, as you well know, Volvo is now a Chinese-owned company that has said it is going to switch over to electric vehicles completely–is it in 2022 or something like that? So, again, we have the Chinese to look to for global technology leadership in a green environment with less CO2 put into the atmosphere. Is Mr. Trump listening? Less CO2 into the atmosphere, because we don’t want to have to move to Mars.

FOSTER: Yes, I think Mr. Trump has other concerns at the moment, but yes, I think there’s great concern among states for environmental health of their people. I think that’s what drove American regulators. I think that’s what drives German regulators, Swedish, etc.–huge issue.

CHOUDHARY: Other questions? I think it’s lunch time. These are important issues. Software governance is definitely not sufficient in its current form right now in automobiles, but we have an entire afternoon of interesting presentations, so now we’ll move to lunch and we will reconvene at 1:30 PM to ask further questions and grill these people. Thank you.
