Shadow AI: What Lurks in the Dark

Shadow IT has long been a major concern for organizations. The risk comes down to what happens when data moves out of the view of IT security. For example, what happens to corporate security when sensitive data is saved to a cloud storage service such as Dropbox or Box? Unless you have a properly deployed cloud access security broker (CASB), a data-at-rest policy, or another strategy in place, you lose the ability to monitor that data's security.
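
To make the shadow IT risk concrete, here is a minimal sketch of how a security team might flag traffic to storage services it does not control. The log format and the list of unsanctioned domains are hypothetical assumptions for illustration, not a definitive implementation:

```python
import re
from collections import Counter

# Hypothetical list of cloud storage domains the organization has not sanctioned.
UNSANCTIONED_DOMAINS = {"dropbox.com", "box.com", "wetransfer.com"}

def find_shadow_storage(proxy_log_lines):
    """Count outbound requests to unsanctioned storage services.

    Assumes a simplified web-proxy export with one request URL per line.
    """
    hits = Counter()
    for line in proxy_log_lines:
        # Pull the registered domain (e.g. "dropbox.com") out of the URL.
        match = re.search(r"https?://(?:[\w-]+\.)*([\w-]+\.\w+)/", line)
        if match and match.group(1) in UNSANCTIONED_DOMAINS:
            hits[match.group(1)] += 1
    return hits

if __name__ == "__main__":
    sample = [
        "2024-05-01 10:02:11 GET https://www.dropbox.com/upload user=jdoe",
        "2024-05-01 10:03:40 GET https://intranet.example.com/home user=jdoe",
    ]
    for domain, count in find_shadow_storage(sample).items():
        print(f"{domain}: {count} request(s)")
```

A CASB does this kind of discovery (and much more) at scale; the point of the sketch is simply that without some control at the egress point, you never see the data leave.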

There is a new version of this fear coming to light: shadow AI. As with shadow IT, you need to ask what data an AI solution can access. Some questions to consider include the following:

  • Is sensitive data available to the AI system?
  • Who can access results from the AI system?
  • What data is used to train the AI system?
  • What other users, data, etc. are associated with the AI system?
  • How is data used by the owners of the AI system?
  • Where are the results and queries stored?

These are just a few of the important concepts to consider when thinking about shadow AI. If an AI offering is a public, shared service, your data becomes part of a larger system. If you grant the AI system access to another system, that system's data can be reached by the AI and should be considered exposed. If your AI system can link to internal systems and be used by external parties, those external parties can reach your internal systems. As fundamental as this sounds, many bright, shiny new AI offerings unfortunately don't protect sensitive data in a way that is acceptable to the average compliance officer. Your data needs to remain your data, and how an AI system is trained, used, and accessed needs to be controlled on your terms.
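
One practical control is to screen outbound prompts before they ever reach a public AI service. The following is a minimal sketch; the regular-expression patterns and the caller-supplied `send` function are hypothetical, and a real deployment would rely on a full DLP engine rather than a handful of regexes:

```python
import re

# Hypothetical patterns for common sensitive-data formats.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "api_key": re.compile(r"\b(?:sk|key)-[A-Za-z0-9]{16,}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def screen_prompt(prompt: str) -> list[str]:
    """Return the names of sensitive patterns found in an outbound prompt."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(prompt)]

def send_if_clean(prompt: str, send):
    """Forward the prompt to an AI service only if no sensitive data is found.

    `send` is whatever function actually calls the external AI API.
    """
    findings = screen_prompt(prompt)
    if findings:
        raise ValueError(f"Prompt blocked, sensitive data detected: {findings}")
    return send(prompt)

if __name__ == "__main__":
    try:
        send_if_clean("Summarize account 123-45-6789 for me", print)
    except ValueError as err:
        print(err)
```

The design choice here is to fail closed: anything that trips a pattern never leaves the network, which is usually easier to defend to a compliance officer than trying to claw data back after it has become part of someone else's system.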

Dark Reading posted an article about this concept HERE. It's a good read for better understanding the risks associated with shadow AI.
