Information Security

Is ChatGPT a risk for your organization? With 3 immediately applicable tips!

Did you know that everything you and your colleagues type into ChatGPT's chat window can be stored and used to train the language model? In other words, every note, confidential document and piece of code can become part of ChatGPT's training data. It is therefore wise for an organization to establish policies to mitigate these risks. What can you include in such a policy? Here are three tips:
This article was last updated on 14/5/2024.

Settings

1. In your ChatGPT settings, turn off "Chat history & training". Note that this also turns off your chat history. This setting is not synchronized with your account, so you need to change it separately on each of your devices. OpenAI is currently working on a business subscription for ChatGPT; with that subscription, your data will not be used to train the model while you still keep your chat history.

Classification levels

2. Set classification levels for the data within your company. This is also an integral part of the ISO 27001 standard. For example, financial statements can be classified as "secret," personnel files as "confidential," a policy document as "internal," and a product manual as "public." You can then agree that information classified as "secret" or "confidential" must not be shared with ChatGPT.

Two-factor authentication

3. Set up two-factor authentication (2FA) for ChatGPT. Unfortunately, ChatGPT itself does not yet support 2FA, but you can still set it up with 2Stable's Authenticator App; see the comments for how to do this yourself. Last Tuesday's news that the data of more than 100,000 ChatGPT accounts was stolen shows once again how important a second lock on your account is.


Do you have other questions about the risks of ChatGPT for your organization? Feel free to contact us! We are actively working on these topics and would be happy to help.

Ruben den Dulk
Information Security Consultant
085 773 60 05