ARTICLE
16 March 2026

The Use Of Generative Artificial Intelligence In The Workplace: Legal Risks And Recommendations For Companies

Egemenoglu

Contributor

Egemenoglu is one of the largest full-service law firms in Turkey, advising market-leading clients since 1968. Egemenoğlu, which is proud to serve many national and international clients from different sectors, is valued by both its clients and the Turkish legal market for its fast, practical, rigorous and solution-oriented work across a wide range of fields of expertise. Egemenoğlu has been recognised in various rankings by the world's leading rating institutions and legal guides. We have been ranked as "Recognized" in the "Project and Finance" and "Mergers and Acquisitions" areas by IFLR 1000. We also rank among the top-tier law firms in Turkey in the Legal 500 rankings, which assess the world's best law firms, in the "Employment Law" and "Real Estate / Construction" areas. Our firm is likewise recognised by Chambers & Partners in the "Employment Law" area.

The Personal Data Protection Authority ("Authority") published an informational document entitled "The Use of Generative Artificial Intelligence Tools in the Workplace" on 5 March 2026. The document draws attention to the risks associated with the use of publicly accessible generative artificial intelligence tools offered by third parties in the workplace and sets out a number of recommendations for companies.

The document defines "Generative Artificial Intelligence" ("GenAI") as artificial intelligence systems trained on large-scale datasets that are capable of producing content in various formats - such as text, images, video, audio, or software code - in response to user prompts. These tools may enhance efficiency across a wide range of business processes, including drafting e-mails, summarising texts, supporting research activities, and facilitating idea development. However, the widespread use of these technologies also entails various legal and institutional risks, particularly when deployed through third-party platforms.

A. The Risk of "Shadow Artificial Intelligence" ("Shadow AI")

The Authority defines "Shadow Artificial Intelligence" as the use of generative artificial intelligence tools by employees in business processes without the knowledge, approval, or oversight of the company for which they work.

Shadow AI use typically arises from employees seeking to accelerate their work or reduce routine task burdens. However, the transfer of meeting notes, internal correspondence, or information relating to corporate data to external generative artificial intelligence platforms may give rise to significant risks for companies.

The principal risks highlighted by the Authority are summarised below:

  • Risks Relating to Decision Quality and Accuracy: When GenAI tools are used without being subject to any oversight or review process, they may produce misleading or inconsistent information.
  • Risks Relating to the Protection of Intellectual Property and Trade Secrets: Sharing product information, business strategies, trade secrets, and other confidential information with GenAI tools may lead to such information being used in the training of artificial intelligence models or being disclosed, and to the company losing control over it, thereby increasing the risk of intellectual property infringement.
  • Risks Relating to Corporate Reputation and Loss of Trust: The ability of GenAI tools to generate inaccurate yet convincing outputs that do not reflect reality may give rise to errors that are difficult to remedy within corporate processes and may result in reputational harm.
  • Risks Relating to Information Security and Cybersecurity: The use of these tools outside corporate oversight may expand the attack surface to which companies are exposed by increasing the risks of malicious software and unauthorised access.
  • Risks Relating to the Protection of Personal Data: The sharing of personal data with GenAI tools outside corporate oversight may give rise to risks of unlawful data processing and data breaches under the Personal Data Protection Law ("PDPL").

B. What Actions Can Companies Take?

The Authority notes that a blanket prohibition on the use of GenAI is neither practical nor advisable, as it may in fact encourage Shadow AI use. Companies are therefore recommended to establish a corporate policy grounded in guidance, balance, and awareness, rather than a prohibitive approach.

The principal actions that companies may consider in this regard are as follows:

  • Establishing a Corporate Usage Policy: Companies should establish a policy that clearly sets out which GenAI tools may be used for which activities, what categories of data may be entered into these tools, and the limits of permitted use.
  • Cautious Approach Regarding Sensitive Information and Personal Data: Care should be taken when sharing information that is sensitive from a corporate perspective, as well as personal data, with GenAI tools, and the privacy policies of the relevant tools should be carefully reviewed. It is recommended that anonymised, generalised, and abstract language be preferred over proper names or identifying expressions when interacting with these tools.
  • Preventing Over-Reliance on Outputs (Automation Bias): Accepting system outputs as accurate without scrutiny and incorporating "hallucination" content - which may appear convincing yet does not reflect reality - into business processes should be avoided. Data obtained from GenAI tools should not be used as the basis for final corporate decisions; rather, it should be treated as a supplementary input subject to human oversight and evaluation.
  • Implementing Data Security and Access Control Measures: Employees should be required to use only tools that have been approved by the company and for which conditions of use have been established. In this context, network-level access restrictions, device-based limitations (such as restricting access to corporate devices only), and role-based access approaches may be considered.
  • Raising Employee Awareness and Establishing Feedback Mechanisms: Policies established by the company should be regularly updated and communicated to employees. Training activities should be conducted to raise awareness of potential risks and matters requiring attention, and feedback mechanisms should be put in place to enable employees to share their experiences and the issues they encounter. This approach will contribute to increasing institutional awareness and identifying areas for improvement.
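Two of the measures above - restricting employees to company-approved tools and avoiding the entry of identifying data into prompts - can be illustrated with a minimal technical sketch. The Authority's document does not prescribe any implementation; everything here (the allowlist, the tool names, the redaction patterns) is a hypothetical illustration, and real deployments would need far more robust detection, for example dedicated data-loss-prevention or PII-detection tooling.

```python
import re

# Hypothetical corporate allowlist of approved GenAI tools (illustrative only).
APPROVED_TOOLS = {"internal-assistant", "approved-vendor-chat"}

# Simple patterns for obvious identifiers. These are deliberately naive;
# they only demonstrate the principle of redacting before submission.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b\+?\d[\d\s-]{8,}\d\b"),
}


def redact(prompt: str) -> str:
    """Replace obvious identifiers with generic placeholders."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label.upper()} REDACTED]", prompt)
    return prompt


def submit_prompt(tool: str, prompt: str) -> str:
    """Refuse tools that are not on the allowlist; redact before sending."""
    if tool not in APPROVED_TOOLS:
        raise PermissionError(f"Tool '{tool}' is not on the corporate allowlist")
    return redact(prompt)
```

In practice such checks would sit in a corporate proxy or browser extension rather than in application code, but the design principle is the same: the policy decision (which tools, which data) is enforced before any content leaves the corporate boundary.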

C. Conclusion

While generative artificial intelligence tools offer significant efficiency and speed advantages in business processes, the emergence of these technologies - particularly in the form of Shadow AI use - may give rise to serious legal and operational risks for companies in relation to data security, the protection of trade secrets, and the processing of personal data.

It is therefore important for companies to approach the use of generative artificial intelligence not merely as a technological efficiency tool, but also through the lens of corporate risk management and data protection. In this context, it is of considerable importance to establish clear and workable corporate policies, define limits on the sharing of sensitive data, strengthen access and oversight mechanisms, and raise employee awareness.

Preventive and guidance-oriented measures taken in this direction will contribute to the safe utilisation of the opportunities offered by generative artificial intelligence technologies and will significantly reduce the likelihood of companies encountering potential legal and reputational risks.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
