Automated vehicles are rolling data centers that continuously collect, process, and transmit vast amounts of data to navigate safely. This development raises fundamental questions: What data is collected? Who has access? And for what purpose? In this third part of our blog series, we shed light on the data protection challenges of autonomous driving under Swiss law.
I. The vehicle as a data collector
A. What data are we talking about?
To understand the data protection implications, a basic understanding of the data generated during the operation of an automated vehicle is essential. While the specific data collected may vary from manufacturer to manufacturer or even from model to model, as well as according to the level of automation (see the introductory article in this series), the commonly occurring data can be divided into the following categories:
- Environmental data: An automated vehicle perceives its surroundings through a variety of sensors, including cameras, Light Detection and Ranging (LiDAR), radar, and GPS. Using this data, the vehicle or the system behind it creates a coherent environmental model to make driving decisions. The vehicle thus knows its spatial location at all times, which route it must take to reach the intended destination, the correct speed, and can also react appropriately to unforeseen situations (e.g., braking to avoid a collision with another road user).
- Diagnostic data: This is technical information about the operational status and functionality of vehicle components. Internal sensors and control units continuously monitor the vehicle's condition, including not only tire pressure or battery status, but also data such as engine and transmission temperature, the condition of the brake pads, software versions of the control units, or error messages from the onboard computer (so-called Diagnostic Trouble Codes). This data is primarily collected for the purpose of ensuring vehicle safety and predictive maintenance. However, it also helps the manufacturer improve products and manage warranty and liability cases.
- Occupant data: This data, collected by interior sensors, concerns the physical condition and behavior of the vehicle occupants, especially the driver. Examples are recordings from interior cameras to monitor the driver's attention and eyelid movement (drowsiness detection) or data from seat occupancy sensors to optimize airbag deployment. The system also collects physiological measurements and interior environmental data, for example, to automatically adjust the seat climate (heating, cooling, drying). The main purpose of this data collection is to increase driving safety by allowing the system to warn the driver of decreasing attention or to intervene in emergency situations (e.g., in a medical emergency). Furthermore, it is used for personalizing vehicle settings, increasing occupant comfort, and for accident analysis.
B. Is this data personal data?
Under the Swiss Federal Act on Data Protection (FADP), personal data is "any information relating to an identified or identifiable natural person" (Art. 5 lit. a FADP).
For some data, such as that concerning the occupants, this personal reference is readily apparent. Drowsiness detection data relates directly to the driver, and recordings from interior cameras showing the face or behavior of the driver and other passengers are also undoubtedly personal data.
However, even vehicle-related data can become personal data, depending on its linkability to a natural person and its intended use.
- The primary link is usually the unique Vehicle Identification Number (VIN). Although the VIN itself only identifies a vehicle, various actors such as vehicle manufacturers, importers, garages, insurance companies, or government authorities (e.g., road traffic authorities) can often link it to the vehicle keeper. If vehicle-related data, such as the diagnostic data described, is collected with a VIN, it can thus be linked not just to the vehicle itself, but also to the vehicle keeper.
- Additionally, it must be possible to derive information about a natural person from the data. While error messages or tire pressure information are not primarily information about a natural person, they can, in combination with the VIN, be used to analyze the driver's behavior in traffic. For example, increased wear on brake pads or frequent error messages might indicate an aggressive driving style. This highlights the relevance of the distinction between the vehicle keeper and the driver: although the data is primarily linked to the keeper via the VIN, the behavioral information relates to the person driving. At the latest, when warranty or liability cases arise, this technical data is directly associated with the vehicle keeper or the driver, making correct attribution crucial.
These examples demonstrate that even vehicle-related information that can be linked to a VIN or a similar identifier can potentially reveal information about the keeper or another natural person, thus qualifying as personal data.
C. The data storage system for automated driving
The Swiss Road Traffic Act (RTA) stipulates in Art. 25e para. 2 that vehicles with an automation system (meaning Society of Automotive Engineers (SAE) Levels 3–5; see the introductory article of this series) must be equipped with a data storage system for automated driving (DSSAD), which must record certain safety-relevant events (e.g., the start and end of an emergency maneuver, collisions, or the execution of a risk-minimization maneuver). The record must document the type of event and any reason, the date, the time and, for driverless vehicles, the position of the vehicle (Art. 7 para. 4 Ordinance on Automated Driving, OAD).
The device only records when the automation system is activated, i.e., when the vehicle is in autonomous operation and "takes" the relevant driving decisions independently, during which time it must not be possible to deactivate the DSSAD (Art. 25f para. 1 RTA). However, no recording occurs when the driver manually operates the vehicle. Access to the data from the DSSAD is primarily granted to the vehicle keeper (via a standardized interface) and the competent police, judicial, and administrative authorities, with the law regulating the purposes for which they may process this data (Art. 25g RTA in conjunction with Art. 18 OAD). The vehicle keeper's right of access is subject to a specific restriction: they may only access data stored during journeys made by third parties without their consent if they can demonstrate a legitimate interest in this data in connection with an accident or a violation of road traffic regulations (Art. 25g para. 1 RTA).
The recordings from the DSSAD thus primarily serve to document the performance of the automation system in safety-relevant moments. They record how the automation system acted, but not the driving behavior or driving decisions of the driver (as the driver was not controlling the vehicle at the time of the recorded event). Consequently, such data does not generally qualify as personal data, even if the identity of the vehicle's occupants at the time of the safety-relevant event is known. However, there are exceptions to this:
- The recorded position of the (driverless) vehicle at the time of the recorded event can provide information about the location of the passengers, provided their identity is known (which will often be the case, as e.g., the operator of a robotaxi will usually know their passengers' identities). This will generally, however, be information already known to the vehicle keeper.
- In vehicles featuring a takeover request (an automation system that informs the driver when it reaches the limits of its operational range, i.e., can no longer operate autonomously, e.g. Highway-Pilot; see Art. 2 OAD), the DSSAD logs various events that allow direct conclusions to be drawn about the driver's behavior. For example, when the driver overrides the automation system in certain situations or when a takeover request is triggered because the driver is unavailable, not present, or not wearing a seatbelt (Art. 24 lit. c nos. 3 and 4 OAD). In these cases, the data will be linked to a natural person, at the latest, when this data is used to clarify liability in the context of administrative or criminal proceedings (Art. 25g RTA) or when asserting insurance claims.
For authorized persons, particularly vehicle keepers and authorities, the DSSAD creates a complex data situation: They gain access to a bundle of data, some of which clearly qualifies as purely technical data and some as personal data. This mixture, and the difficulty of clearly classifying the data in individual cases, makes a careful data protection assessment essential.
In our fictional case study (see the introductory article of this series), the DSSAD would thus at least record the collision with Mr. Grünlich's vehicle and likely also the safety-relevant system error caused by the faulty update, each with a timestamp and the vehicle's position. This recording would be a key piece of evidence to prove that the automation system from CodeDrive Solutions Inc. was defective.
II. Who is behind the wheel? Responsibilities in data traffic
A. The question of the "master of the data"
Automated vehicles collect and process vast amounts of data for various purposes. This raises a central question of data protection law: Who is actually responsible for ensuring that this data is processed in accordance with data protection requirements?
In the data protection context of automated vehicles, we are looking at a complex network of various actors, including the vehicle manufacturer, the vehicle keeper, software and component suppliers, and the vehicle users. When the vehicle decides how it drives and which route it takes, and data is collected by various parties in the process, who is then the "master of this data"?
B. The legal concept: The decision maker is responsible
The FADP does not link responsibility to the ownership or use of the vehicle, but rather to the decision-making power over the processing of data. It differentiates between the following two data protection roles:
- Controller: The FADP defines the controller as the private person or federal body that "alone or jointly with others, determines the purpose and the means of processing personal data." The controller is thus the central figure in data protection: In particular, they must ensure compliance with legal principles during data processing, fulfil the information obligations regarding the collection and processing of personal data, and respond to access requests from data subjects.
- Processor: In contrast, the processor processes personal data only on behalf of and according to the controller's instructions, acting as the "extended arm" of the controller. The crucial point here is that the processor may not use the data for its own purposes. If it does, it becomes a controller itself.
This distinction is often difficult in practice. The decisive factor is who defines the essential parameters, especially the purpose and means of the data processing. Who initiated the data processing and determines the "why", and thus the actual purpose, of the processing? Who decides which data is collected, whether and how it is stored, and which systems can access it? The answers to these questions determine who is considered the controller under data protection law.
C. Actors in the automated vehicle ecosystem
At the latest when we apply this concept to the reality of automated driving, the illusion of a single controller dissolves. Instead, a network of actors often processes data simultaneously and in parallel. The following example constellations illustrate this (for the example case, see the introductory article in this series):
- Vehicle manufacturer: The manufacturer ("Sentinel Motors Corp.") is rarely just a supplier of hardware. With modern, connected vehicles, it remains deeply involved in data processing. It generally defines which sensors are installed, what data they collect and how. It regularly uses diagnostic data for the development or improvement of its products but will also use it to defend against liability claims if necessary. Since the manufacturer determines the "whether" and "how" of this data processing and this often serves its own purposes (e.g., research, development, quality assurance), it is the controller with respect to these processing activities. It is not a processor for the vehicle keeper, as it does not act on the keeper's instructions – the vehicle keeper does not dictate how the manufacturer must train its algorithms.
- Vehicle keeper / fleet operator: In our example case, this is "Innovate Mobility AG", the operator of the shuttles. It decides the vehicles' specific routes, collects data in the process, and will use certain data for fleet management, billing, and passenger safety. For the related processing of personal data, Innovate Mobility AG is therefore also the controller: It determines the purposes (e.g., passenger transport, fleet control) and uses the vehicles as a means to that end. The fact that it did not build the technical infrastructure (the vehicle) itself changes nothing, because it decides on its use. Insofar as the data recorded by the DSSAD constitutes personal data – for example, because it allows conclusions to be drawn about a passenger's position at the time of a safety-relevant event – the vehicle keeper/fleet operator is also responsible for its processing under data protection law within the scope of its access rights.
- Driver: The driver is primarily the data subject whose data is processed. He cannot be the controller for the processing of his own driving or behavioral data. However, in certain scenarios, the driver can (unlike, for example, a mere passenger) become a (joint) controller if he actively decides on the processing of third-party data, for example, by activating an interior camera that also films the other passengers.
- Software and component supplier: Companies that develop camera systems, LiDAR sensors, or infotainment software often define the technical specifications for data collection by their components. If the associated data processing (also) serves the suppliers' own purposes, they also become controllers. This is shown in two scenarios of the example case:
- Scenario A (Processor): When "CodeDrive Solutions Inc." extracts and processes error codes on behalf of Innovate Mobility AG without using the data for its own purposes (e.g., training its own AI models), it acts as a processor.
- Scenario B (Controller): In practice, however, software providers will often want to ensure they can use real-world driving data, especially to further develop their algorithms. As soon as CodeDrive uses the data from the accident (e.g., image sensor recordings of the red light) to generally improve its image recognition software (and not just to fix the bug for Innovate Mobility), it does so for its own purpose. It thus becomes a (joint) controller.
- Providers of third-party services: For example, app providers in the infotainment system (e.g., for music streaming or parking space searches) or insurance companies with "pay-as-you-drive" tariffs are generally the controllers for the data they collect via their own services.
- Garages and maintenance workshops: For the processing of personal data as part of a repair or service order, the garage will generally be the controller. In the context of automated vehicles, this could involve, for example, the disclosure of diagnostic data to the manufacturer, for instance, for warranty claims.
D. The complex issue of joint controllership
The complex interplay between the various actors often makes assigning sole responsibility difficult. Instead, it must be determined on a case-by-case basis whether joint controllership exists or whether several actors are each acting as independent, sole controllers for different processing steps.
Joint controllership arises when multiple actors jointly determine the purposes and means of data processing. This can be the case in particular during the initial data collection, where the interests of the manufacturer, software supplier, and fleet operator are often inextricably linked. For example, the technical design of the sensors by the manufacturer can directly serve the purposes of the fleet operator (e.g., route optimization) and the software provider (e.g., algorithm training). In such constellations, where the processing activities are inseparable for achieving the respective goals, joint controllership is the likely conclusion.
The existence of joint controllership has consequences, particularly from a contractual and liability perspective. Although the FADP, unlike the General Data Protection Regulation (EU GDPR) applicable in the European Economic Area, does not require a formal contract for joint controllership, such agreements will nevertheless be common among joint controllers. This is either because the actors themselves or certain data processing activities are subject to the EU GDPR and the actors are therefore obliged to have contractual arrangements anyway, or because the actors want to explicitly regulate the duties and responsibilities among themselves, even without being obliged to do so. This is generally advisable, because in relation to the data subject (e.g., an accident victim), all controllers involved are in principle liable if a personality right is infringed.
In external relations, the joint controllers are therefore jointly and severally liable for the entire (joint) data processing. This sometimes also applies to the obligation to inform data subjects about the collection and processing of their data and to provide further information upon request. A data subject can therefore, for example, exercise their right of access by contacting any of the jointly responsible actors, who must then coordinate and respond to such data subject requests internally.
This also applies to various processing activities in the case example: To the extent that, for example, Innovate Mobility AG as the vehicle keeper and Sentinel Motors Corp. as the vehicle manufacturer both have an interest in certain driving data (one for fleet management, the other for vehicle diagnostics) and the technical infrastructure is designed in such a way that the data automatically flows to both, joint controllership is likely – along with the consequences described above.
Nevertheless, this is not the case for all data processing activities. Often, there are also parallel but separate responsibilities: For example, the manufacturer may be solely responsible for using diagnostic data for product improvement, while the fleet operator bears sole responsibility for using the same data for billing the passenger. The idea of a "single" controller for the entire ecosystem is incorrect; rather, the analysis must be nuanced and can lead to different results depending on the purpose of the processing.
For the vehicle driver, a passenger (like Linda Pünktlich) or involved third parties (such as the victim of the accident Mr. Grünlich), the actual responsibilities are often opaque. Data protection regulations attempt to account for this circumstance by defining the term controller broadly: Anyone who has a decisive influence on the data processing can, in principle, be held accountable under data protection law.
III. Green light for data processing – What must be considered?
The questions discussed so far regarding the data generated and the responsibilities for processing activities are indeed complex in the context of automated vehicles. Fundamentally, however, these are typical data protection issues common to data-driven innovations. This also applies to the following data protection topics:
- The multitude of relevant actors in the automated driving ecosystem creates various data protection challenges. One of these is how actors who process personal data as (sole or joint) controllers, but have no direct contact with the data subject, can fulfill their data protection information obligations—in other words, how they provide their respective privacy policies to the data subjects. For example, data from the navigation system and vehicle sensors (e.g., position, speed, braking events) are processed not only by the vehicle manufacturer but also by the provider of the integrated map service to generate real-time traffic information and improve map quality. Since this provider also uses the data for its own purposes to improve its service for all its customers, it is acting as a joint controller with the manufacturer. However, these background actors have no direct relationship with the driver and thus no way to provide them with a privacy policy at the moment of data collection. Fulfilling the information obligation becomes practically impossible for them.
As a potential solution, the vehicle manufacturer can act as the central and visible entity for the data subject. The contracts between the relevant actors should ensure that it is responsible for ensuring transparency across the entire data ecosystem. This does not necessarily mean that it must fully integrate the privacy policies of all partners into its own document. A more pragmatic and user-friendly approach is for the manufacturer to clearly name the various actors (such as the map service provider) and the data categories transferred to them in its own privacy policy. Additionally, it ensures that the detailed, separate privacy policies of these partners are easily accessible to the user, for example, in the infotainment system. The manufacturer thus acts as a curator of data protection information, fulfilling its own information obligation while enabling the other controllers to fulfill theirs by creating the necessary access point for the user. For external data subjects like pedestrians, a QR code on the vehicle could also link to a website with the relevant data protection information. However, the practical benefit here is questionable, as pedestrians are generally unable to scan such a QR code on a passing vehicle.
- In the automated driving ecosystem, numerous actors process vast amounts of data for a variety of purposes. The principle of purpose limitation serves as a central protective mechanism against an uncontrolled expansion of data processing. It stipulates that personal data may only be processed for the purpose specified at the time of collection, for a compatible purpose, or as permitted by law.
The particular risk lies in secondary purpose deviation: data collected for a legitimate purpose is subsequently used for new, originally unintended purposes. For instance, Innovate Mobility AG could analyze Linda Pünktlich's movement data, initially collected for trip execution and billing, to show her personalized advertising for cafés, restaurants, or similar offers along her usual routes. Given the increased intensity of the intrusion into Linda Pünktlich's privacy, such further processing would likely only be permissible with her consent.
It is therefore essential for all actors to precisely define the processing purposes from the outset and to communicate them clearly and comprehensibly in their privacy policies. However, this very demand for a clear purpose definition often presents significant challenges in practice, as it is not always possible to definitively foresee all future processing activities, especially with new technologies.
- In the context of autonomous driving, camera-based driver-monitoring systems are increasingly common. From a data protection perspective, it is crucial whether these systems collect and process biometric data as defined by law. According to Art. 5 lit. c FADP, biometric data is data that enables the unique identification of a natural person. This distinction has far-reaching consequences. A simple fatigue detection system, which only analyzes behavioral patterns such as blink frequency or head posture to infer the driver's condition, generally does not generate biometric data. Its purpose is not identification but state analysis, and the image data can be processed locally and ephemerally without a person being uniquely recognized. The situation is different with a facial scan used to identify the driver, for example, to start the vehicle or load personal settings. Here, unique identification is the purpose of the processing, which is why the generated data is considered biometric data and thus personal data worthy of special protection.
Contrary to a widespread belief, however, processing personal data worthy of special protection, such as biometric data, does not always require a justification. Even if a justification is required, it does not always have to be consent. If biometric identification is inherent to the system for providing a function chosen by the customer and is inseparably linked to the performance of the contract, the manufacturer can rely on this overriding interest—provided that they transparently inform the user. However, explicit consent is unavoidable as soon as the data is used for other purposes or disclosed to third parties. For example, the commercial use of biometric features for marketing or their sale to insurance companies would be unlawful without explicit consent. Furthermore, the large-scale processing of biometric data generally triggers the obligation to conduct a data protection impact assessment (Art. 22 FADP). From a commercial perspective and in the interests of privacy-friendly design, manufacturers should offer customers a non-biometric alternative (e.g., PIN entry) to increase acceptance and minimize legal risks.
- A look at our case example illustrates a common scenario in modern IT infrastructures: The data collected by the sensors of the "Sentinel Pod" vehicle are not stored locally in the vehicle or on Swiss servers. Instead, the data is transmitted for analysis to the manufacturer, Sentinel Motors Corp., in the USA.
The FADP follows the principle that personal data may only be disclosed abroad if the recipient state's legislation ensures adequate protection. If this is not the case—and the USA is generally considered a country without an adequate level of data protection—special protective measures must usually be taken. The legal situation here is evolving, which is crucial for companies like Innovate Mobility AG. An important bridge was built with the "Swiss-U.S. Data Privacy Framework" (DPF), which the Federal Council recognized in 2024. US companies can be certified under this framework. If Sentinel Motors Corp. is listed as certified on the official DPF list, personal data may be transferred to it without additional permits or complex contracts. With its adequacy decision, the Federal Council has confirmed that sufficient protection exists for such certified companies—despite possible access by US authorities.
But what if the US partner is not certified? In practice, many US companies lack this certification. In such cases, Innovate Mobility AG must resort to contractual guarantees. The most common solution in practice is to agree to apply the Standard Contractual Clauses (SCC) of the European Commission. These are recognized by the Federal Data Protection and Information Commissioner (FDPIC), provided that they are adapted to the specific circumstances as required by the FDPIC. By concluding these clauses, the US (data) importer contractually commits to complying with European data protection standards.
Previously, using SCCs involved a significant hurdle: it was always necessary to check in a complex "Transfer Impact Assessment" (TIA) whether access by US authorities could undermine data protection (the Schrems II issue). Here, the Federal Council's adequacy decision on the DPF has effectively simplified this, even where SCCs are used: Although a TIA must, strictly speaking, still be conducted, (data) exporters can in practice rely on this official assessment for their risk evaluation, because the Federal Council has determined that the US legal situation is fundamentally compatible with Swiss rule-of-law principles.
For our example case, this means: Innovate Mobility AG would check whether Sentinel Motors Corp. is certified under the DPF. If it is not, Innovate Mobility will generally rely on the SCC for the data disclosure, which requires agreeing on these clauses with Sentinel Motors.
- A particular challenge can be the correct attributability of the collected data. The problem arises, for example, when the vehicle keeper is not the driver—a common occurrence when the vehicle is loaned, rented, or used by a family member. A vehicle's systems typically link the resulting driving and behavioral data to the keeper via the VIN, which can lead to potentially problematic misattributions. For example, an aggressive driving style of the borrower could be incorrectly attributed to the keeper, which in turn could lead to incorrect conclusions by the manufacturer (e.g., regarding warranty claims) or by insurance companies (e.g., in premium calculations for "pay-how-you-drive" models).
Strictly speaking, vehicle manufacturers in particular would have to provide mechanisms that allow for a clear separation and correct attribution of data to the respective drivers, for example, through driver-specific profiles. Currently, this is often not the case.
IV. Key takeaways
This article illustrates that automated driving is not only a technical and traffic law revolution, but also a data protection challenge. Let's take a summary look at the most important topics discussed and their relevance to the example case:
- Almost all data is personal data: Even technical vehicle data such as tire pressure or error messages become personal data when linked to the Vehicle Identification Number (VIN) because they can allow conclusions to be drawn about the behavior of the driver or vehicle keeper and are therefore subject to the FADP. The distinction between keeper and driver thus becomes a central challenge.
- The data storage system for automated driving: The legally required DSSAD records events during autonomous operation. While much of this data, such as purely technical system messages, is not personal data, it becomes personal data as soon as the recordings allow conclusions to be drawn about a person's behavior—for example, a driver's reaction (or non-reaction) to a takeover request. The practical consequence: Accessing parties such as vehicle keepers or authorities must recognize that they are receiving a mixed dataset and are obliged to fully comply with the FADP's requirements for the personal data portion.
- Complex data protection responsibilities: The idea of a single data controller does not fit the reality of autonomous driving. Manufacturers, fleet operators, vehicle keepers, software suppliers, and other actors regularly decide jointly on data processing and are therefore considered joint controllers. This affects, for example, the assertion of data subject rights, as these can generally be asserted against any of the joint controllers.
- Purpose limitation as a guardrail: Data may only be used for the purposes clearly defined and communicated at the time of collection. It is recommended that the actors involved define the necessary and desired data processing as clearly as possible in advance and communicate this to the data subjects in their privacy policies (or through other information).
- Data transfers abroad: Data flows often do not stop at national borders—especially not in the context of automated driving. In this regard, the data protection requirements for disclosures abroad must be observed. For data disclosures to the USA, recognized instruments are available for this purpose with the "Swiss-U.S. Data Privacy Framework" and the Standard Contractual Clauses of the European Commission (including adjustments according to the FDPIC).
You can find out more about this at our event "Autonomous driving – navigating the legal complexities":
https://lunchandlearn2026.events.vischer.com/
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.