
Hi there,


AI systems have a security and data privacy problem.


In fact, many AI projects never make it past the CISO because of data security concerns, so the success of your AI project depends on how you choose to protect sensitive data.


Today I'm excited to share a new way to protect sensitive AI data: a product we've recently announced called Cloaked AI. At its core, Cloaked AI is a protective layer that encrypts sensitive AI data stored as vector embeddings while preserving its usability. (If you want to get in on the private beta, join the waitlist.)


Here are some of the most common questions we get about protecting sensitive AI data.


Q: Where is the sensitive data in shared model AI systems?

A: The vector database.

When you take advantage of large models for generative AI, chat, semantic search, and similar applications, sensitive data gets transformed into vector embeddings, which are stored in a vector database. To a human, vectors are meaningless strings of numbers. But to the AI, they contain all of the meaning found in the original sensitive data, and generative AI systems can recreate that original data to a high degree of accuracy (though in their own style). That makes the data stored in vector databases a significant security and privacy risk, and one that can block AI projects from launching. With Cloaked AI, those problems go away.
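To see why embeddings are sensitive, here's a toy sketch in Python. The `toy_embed` function below is a hypothetical stand-in for a real embedding model (it just hashes words into a fixed-size vector), but it shows the key property: the vector looks like meaningless numbers to a human, yet a query about the same topic still lands close to it, which is exactly what embedding-inversion and similarity attacks exploit.

```python
import hashlib
import math

def toy_embed(text, dim=16):
    # Toy stand-in for a real embedding model: hash each word into a
    # fixed-size bucket and normalize. Real models capture semantics far
    # more richly, but the shape of the stored data is the same.
    vec = [0.0] * dim
    for word in text.lower().split():
        h = int(hashlib.sha256(word.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a, b):
    # Similarity between two unit vectors.
    return sum(x * y for x, y in zip(a, b))

secret = "patient has stage two diabetes"
vec = toy_embed(secret)
print(vec)  # to a human: just numbers

# But a query on the same topic scores far higher than an unrelated one,
# so the vector still leaks what the original record was about.
print(cosine(vec, toy_embed("diabetes stage two patient")))
print(cosine(vec, toy_embed("quarterly revenue forecast")))
```

The point isn't that this toy is attackable; it's that any faithful embedding preserves meaning, so whoever can read the vectors can recover what they encode.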


Q: Is it good enough if a vector database encrypts in transit and at rest?

A: No.

In-transit and at-rest encryption is important, but it doesn't protect data from anyone with access to the running cloud servers: at-rest encryption primarily defends against stolen-hard-drive attacks. What you need is application-layer encryption, which protects data on running machines. The only product currently coming to market that works with the most commonly used vector databases is, you guessed it, Cloaked AI by IronCore Labs.
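The idea behind application-layer encryption is simple: the application encrypts before the data ever reaches the database, so the server and its operators only ever see ciphertext. Here's a minimal sketch of that flow. The cipher below is a toy built from SHA-256 (not IronCore's implementation, and not something to use in production, where you'd reach for a vetted AEAD cipher); only the before-it-leaves-the-app pattern is the point.

```python
import hashlib
from itertools import count

def keystream(key: bytes, nonce: bytes):
    # Toy stream cipher: SHA-256 in counter mode. Illustration only;
    # use a vetted AEAD cipher (e.g. AES-GCM) in real systems.
    for i in count():
        block = hashlib.sha256(key + nonce + i.to_bytes(8, "big")).digest()
        yield from block

def xor_crypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the same call encrypts and decrypts.
    ks = keystream(key, nonce)
    return bytes(b ^ next(ks) for b in data)

# Application-layer encryption: encrypt in the app, then store.
key, nonce = b"app-held-secret-key", b"unique-nonce-01"
record = b"ssn=123-45-6789"
stored = xor_crypt(key, nonce, record)       # what the database sees
assert stored != record                      # server never sees plaintext
assert xor_crypt(key, nonce, stored) == record  # only the app can decrypt
```

Because the key lives with the application and never reaches the database, a compromised server, a curious admin, or a leaked snapshot yields only ciphertext.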


Q: If we're encrypting the vector embeddings, will that affect performance?

A: It's negligible. 

The time it takes to encrypt is dwarfed by network call times. The one caveat is key management: whether keys are held in memory or have to be fetched from a remote server on each operation can affect performance. We have several key management strategies that address this.
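One common strategy for that caveat is a short-lived in-memory key cache: fetch a key from the key server once, then serve it from memory until it expires. The sketch below is a generic illustration of that pattern (the `fetch` callable standing in for a remote KMS call is hypothetical), not a description of Cloaked AI's internals.

```python
import time

class CachedKeySource:
    """Fetch keys from a (hypothetical) remote key server once, then
    serve them from memory so per-request encryption adds no network
    round trip until the cache entry expires."""

    def __init__(self, fetch, ttl_seconds=300):
        self._fetch = fetch          # remote call, e.g. to a KMS
        self._ttl = ttl_seconds
        self._cache = {}             # key_id -> (key, expiry time)

    def get(self, key_id):
        entry = self._cache.get(key_id)
        now = time.monotonic()
        if entry and entry[1] > now:
            return entry[0]          # in-memory hit: microseconds
        key = self._fetch(key_id)    # remote fetch: network latency
        self._cache[key_id] = (key, now + self._ttl)
        return key
```

With a cache like this, the remote fetch cost is paid once per key per TTL window instead of once per encrypted record, which is why the steady-state overhead of encryption stays negligible.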


Q: How can I get access to the private beta to start testing this?

A: Sign up or email me!

We've got a waitlist signup on our website, and you can also reply to this email. In August, we're hosting a webinar, and I invite you to join us for the inside scoop on Cloaked AI.

Join the Webinar

Thanks for reading along! And please reply to this email if you want early access to Cloaked AI.


Until next month,

Patrick Walsh
CEO, IronCore Labs

LinkedIn
Twitter
GitHub
Mastodon

IronCore Labs, 1750 30th Street #500, Boulder, CO 80301, United States, 303-261-5067

Unsubscribe | Manage preferences