By IAPME Suisse, AI & SME Consultant

nLPD and AI: Obligations for Swiss SMEs in 2026

Everything Swiss SMEs need to know about the new Federal Data Protection Act (nLPD) and the use of artificial intelligence. Obligations, penalties, and best practices.


The new Federal Data Protection Act (nLPD), which came into effect on September 1, 2023, has significantly altered the legal framework within which Swiss companies operate artificial intelligence technologies. For Swiss SMEs, this regulation represents both a compliance challenge and an opportunity to differentiate themselves through ethical and transparent practices.

As AI adoption accelerates across the Swiss economic landscape, it is crucial to understand the concrete implications of the nLPD on your AI projects. This article outlines the legal obligations, potential risks, and practical measures to implement.

Reminder: What is the nLPD?

The new Federal Data Protection Act (nLPD) replaces the outdated 1992 law, which was no longer suitable for contemporary digital challenges. Inspired by the European GDPR while retaining Swiss-specific features, it significantly strengthens the rights of individuals and the obligations of data controllers.

Key principles applicable to AI

The nLPD is based on several principles directly relevant to AI systems:

  • Transparency: Any individual whose data is processed must be informed.
  • Purpose limitation: Data can only be used for the purpose announced during its collection.
  • Proportionality: Only strictly necessary data should be processed.
  • Accuracy: Data must be correct and kept up to date.
  • Security: Adequate technical and organizational measures must protect the data.

These principles fully apply to AI systems, whether it's a chatbot, a customer scoring tool, or a recruitment algorithm.

Specific obligations for SMEs using AI

1. Enhanced duty of information

When an SME deploys an AI system processing personal data, it must inform the individuals concerned in a clear and understandable manner. This information must specify:

  • The identity of the data controller
  • The purpose of the automated processing
  • The categories of data processed
  • Any potential recipients of the data
  • If applicable, the transfer of data abroad

Concrete example: If your SME uses an AI-powered CRM to analyze customer behavior, you must inform them that their interactions are analyzed by an algorithm, for what purpose, and which data is used.

2. Data Protection Impact Assessment (DPIA)

Article 22 of the nLPD requires a data protection impact assessment when processing poses a high risk to the personality or fundamental rights of the individuals concerned. AI systems often present this level of risk, particularly when they:

  • Create personality profiles
  • Process sensitive data on a large scale
  • Systematically monitor publicly accessible spaces
  • Make automated decisions with legal effects

Estimated cost: A full DPIA for an AI project costs between CHF 3,000 and CHF 15,000 depending on complexity, a reasonable investment considering the potential penalties.
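The four high-risk criteria above can be expressed as a simple triage check. This is a deliberately simplified sketch — a real DPIA decision weighs many more factors, and the function name and parameters are illustrative only:

```python
def requires_dpia(profiling: bool, sensitive_data_at_scale: bool,
                  systematic_monitoring: bool, automated_legal_effects: bool) -> bool:
    """Simplified triage: any one of the nLPD high-risk indicators
    (Art. 22) is enough to trigger an impact assessment."""
    return any([profiling, sensitive_data_at_scale,
                systematic_monitoring, automated_legal_effects])

# A recruitment-screening algorithm: automated decisions with legal effects
print(requires_dpia(profiling=True, sensitive_data_at_scale=False,
                    systematic_monitoring=False, automated_legal_effects=True))
```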

3. Maintaining a record of processing activities

Companies with 250 or more employees must maintain a record of processing activities. Smaller companies are exempt only if their processing poses a low risk to the individuals concerned — which is rarely the case with AI.

This record must document:

  • Each AI system used and the data it processes
  • The legal basis for the processing
  • Security measures in place
  • Any subcontractors involved
  • Data retention periods
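The fields listed above can be kept in any format — a spreadsheet is fine for most SMEs — but structuring them helps keep entries consistent. A minimal sketch of one record entry, with purely illustrative field names and example values:

```python
from dataclasses import dataclass

@dataclass
class ProcessingRecord:
    """One entry in the record of processing activities (illustrative schema)."""
    ai_system: str              # the AI system and the data it processes
    data_categories: list[str]  # categories of personal data involved
    legal_basis: str            # legal basis for the processing
    security_measures: list[str]  # technical and organizational measures
    subcontractors: list[str]   # providers acting as processors
    retention_period: str       # how long the data is kept

# Hypothetical entry for the AI-powered CRM example from earlier
crm_record = ProcessingRecord(
    ai_system="CRM behaviour-analysis module",
    data_categories=["contact details", "purchase history", "email interactions"],
    legal_basis="performance of a contract",
    security_measures=["encryption at rest", "role-based access control"],
    subcontractors=["CRM SaaS provider (EU hosting)"],
    retention_period="24 months after last customer contact",
)
```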

4. The right to human decision-making

A crucial point for SMEs using AI in their decision-making processes: Article 21 of the nLPD grants individuals the right to demand that an automated decision with legal effects be reviewed by a human.

In practice, if your AI system automatically rejects a job application, denies a loan, or determines a price, the individual concerned can request a manual review of this decision.

Penalties for non-compliance

The nLPD provides for criminal fines of up to CHF 250,000. Unlike under the GDPR, it is not companies but the responsible individuals (executives, IT managers) who are personally liable.

The most common violations related to AI include:

| Violation | Maximum fine |
|---|---|
| Failure to inform | CHF 250,000 |
| Breach of due diligence with subcontractors | CHF 250,000 |
| Violation of minimum security requirements | CHF 250,000 |
| Failure to report data breaches | CHF 250,000 |

The Federal Data Protection and Information Commissioner (FDPIC) also has extensive investigative powers and can order the modification or cessation of non-compliant processing.

Practical guide: 7 steps to ensure nLPD compliance for your AI projects

Step 1: Map your existing AI processing activities

First and foremost, create a comprehensive inventory of all AI tools used in your SME. This includes SaaS solutions with integrated AI, even when this component is not highlighted by the provider.

Step 2: Assess the risks of each processing activity

For each identified AI system, determine the risk level based on the data processed, volume, purpose, and potential consequences for the individuals concerned.

Step 3: Update your privacy statements

Your legal notices and privacy policies must explicitly mention the use of AI, the data processed, and the rights of the individuals concerned.

Step 4: Review your contracts with AI providers

Each AI solution provider is considered a subcontractor under the nLPD. A contract compliant with Article 9 must govern the relationship, specifying security measures, confidentiality obligations, and data return or destruction modalities.

Step 5: Secure cross-border data flows

Many AI tools (ChatGPT, Google Cloud AI, AWS) host their data in the United States. The nLPD requires specific guarantees for these transfers: standard contractual clauses, certification, or explicit consent.

Tip: Opt for solutions hosted in Switzerland or the EU whenever possible. Local alternatives exist for many use cases.

Step 6: Document your automated decision-making processes

For each automated decision made by AI that significantly impacts individuals, document the algorithm's functioning, the criteria used, and the human review mechanism.
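The documentation and human-review requirements above can be built directly into the decision logic. A minimal sketch for a hypothetical automated loan decision — the function, threshold, and output fields are illustrative, not a prescribed format:

```python
def decide_loan(score: float, threshold: float = 0.5) -> dict:
    """Hypothetical automated decision that records its criteria and
    offers the human review required by Art. 21 nLPD on adverse outcomes."""
    approved = score >= threshold
    return {
        "decision": "approved" if approved else "rejected",
        "automated": True,
        # Document the criteria used, so the decision can be explained later
        "criteria": {"score": score, "threshold": threshold},
        # Adverse automated decisions must be reviewable by a human on request
        "human_review_available": not approved,
    }

result = decide_loan(0.3)
```

Logging the criteria alongside each decision makes it straightforward to answer a review request without reconstructing the algorithm's state after the fact.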

Step 7: Train your teams

nLPD compliance is not solely the responsibility of the legal department. Every employee using AI tools must understand the basic principles of data protection and the best practices to follow.

The specific case of generative AI

The use of generative AI tools (ChatGPT, Claude, Midjourney) by SME employees poses specific challenges:

  • Risk of inadvertent disclosure: Data entered into a generative AI tool may be used to train the model. It is essential to formally prohibit the entry of personal or confidential data.
  • Hosting outside Switzerland: Most of these tools are hosted in the United States, requiring additional guarantees.
  • Lack of control over outputs: Generated content may contain erroneous or biased information, potentially exposing the company to liability.

A generative AI usage policy, validated by management and communicated to all employees, is a minimum requirement.
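A usage policy can be backed by a technical guardrail that screens prompts before they leave the company. The sketch below uses a few naive regular-expression patterns for illustration only — real data-loss-prevention tooling is far more thorough, and the pattern names are assumptions:

```python
import re

# Deliberately naive patterns, for illustration only
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "swiss_ahv_number": re.compile(r"756\.\d{4}\.\d{4}\.\d{2}"),
    "swiss_phone": re.compile(r"\+41\s?\d{2}\s?\d{3}\s?\d{2}\s?\d{2}"),
}

def contains_pii(prompt: str) -> list[str]:
    """Return the names of the PII patterns found in a prompt,
    so it can be blocked or redacted before reaching an external tool."""
    return [name for name, pattern in PII_PATTERNS.items() if pattern.search(prompt)]

blocked = contains_pii("Summarize the complaint from jane.doe@example.ch")
# A non-empty result means the prompt should be refused or redacted first
```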

Interaction with the European AI Regulation (AI Act)

Although Switzerland is not directly subject to the European AI Act, which has been gradually implemented since 2024, Swiss SMEs exporting to the EU or serving European clients must consider it. The AI Act's requirements for transparency, risk management, and human oversight usefully complement the nLPD framework.

Moreover, the Federal Council has announced plans to develop a specific regulatory framework for AI, potentially inspired by the European approach. Adopting best practices now is a prudent preventive investment.

nLPD compliance budget for an AI project

| Item | Estimated cost (CHF) |
|---|---|
| Initial compliance audit | 2,000 – 5,000 |
| Impact assessment (DPIA) | 3,000 – 15,000 |
| Updating legal documents | 1,500 – 4,000 |
| Team training | 1,000 – 3,000 |
| Annual support | 2,000 – 8,000 |
| Total first year | 9,500 – 35,000 |

These amounts, while significant, are far lower than potential penalties and represent an investment in the trust of your clients and partners.

Conclusion

nLPD compliance in the context of AI projects is not optional for Swiss SMEs: it is a legal obligation with significant personal penalties. Beyond the regulatory aspect, a responsible approach to AI is a genuine competitive advantage in a Swiss market where trust is a cardinal value.

The key is not to wait for an incident to act. A proactive, structured approach proportionate to your company's size will allow you to fully leverage AI while respecting the rights of your clients, employees, and partners.


Want to evaluate the nLPD compliance of your AI projects? Request your free audit and receive a personalized diagnosis within 48 hours.

