WitnessAI Establishes Guardrails for Generative AI Models

Generative AI can sometimes produce inaccurate, biased, or harmful outputs. The question is: can it be made “safe”? Rick Caccia, CEO of WitnessAI, believes it can.

Caccia, formerly the SVP of marketing at Palo Alto Networks, compares AI models to sports cars: “Having a powerful engine is useless without good brakes and steering. Controls are as crucial as the engine itself.”

This analogy resonates with many enterprises that, despite seeing the potential productivity benefits of generative AI, worry about its limitations and risks. An IBM poll found that 51% of CEOs are hiring for new generative AI roles, while a separate Riskonnect survey found that only 9% of companies feel equipped to handle the associated threats, including privacy and intellectual property risks.

WitnessAI’s Risk Mitigation Platform

WitnessAI’s platform intercepts interactions between employees and their company’s custom generative AI models, such as Meta’s Llama 3, rather than models accessed via APIs like OpenAI’s GPT-4. It then applies policies and safeguards to mitigate risks.

“Enterprise AI aims to democratize access to data so employees can perform better. However, overexposure of sensitive data, or data leaks and theft, are significant concerns,” says Caccia.

WitnessAI offers various modules to tackle different generative AI risks. One module allows organizations to set rules preventing employees from misusing AI tools, such as querying about pre-release earnings or sharing internal codebases. Another module redacts sensitive information from prompts sent to models and applies safeguards against attacks, such as prompt injection, that try to push models off script.
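
To give a rough sense of what prompt-level controls of this kind can look like in general, here is a minimal Python sketch of a gateway that blocks policy-violating prompts and redacts sensitive strings before anything reaches a model. It is purely illustrative: the rule and function names are hypothetical and are not drawn from WitnessAI's product.

# Illustrative only: a toy prompt gateway that applies a block rule and
# redacts sensitive strings before a prompt is forwarded to a model.
# All names here (BLOCKED_TOPICS, SENSITIVE_PATTERNS, screen_prompt) are hypothetical.
import re

BLOCKED_TOPICS = [r"pre-release earnings", r"unannounced acquisition"]
SENSITIVE_PATTERNS = {
    r"\b\d{3}-\d{2}-\d{4}\b": "[REDACTED-SSN]",                        # US Social Security numbers
    r"\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}\b": "[REDACTED-EMAIL]",
}

def screen_prompt(prompt: str) -> str | None:
    """Return a sanitized prompt, or None if a policy rule blocks it outright."""
    lowered = prompt.lower()
    if any(re.search(topic, lowered) for topic in BLOCKED_TOPICS):
        return None  # policy violation: do not forward to the model
    for pattern, placeholder in SENSITIVE_PATTERNS.items():
        prompt = re.sub(pattern, placeholder, prompt)
    return prompt

if __name__ == "__main__":
    print(screen_prompt("Summarize our pre-release earnings"))    # None (blocked)
    print(screen_prompt("Email jane.doe@example.com the draft"))  # email redacted

A production system would rely on identity-aware policies and far more robust detection than keyword matching, but the flow is the same: inspect, block or sanitize, then forward.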

“We aim to address the problem of safe AI adoption and offer a solution tailored to that need,” Caccia explains. “CISOs want to protect their businesses, and WitnessAI helps by ensuring data protection, preventing prompt injection, and enforcing identity-based policies. Chief privacy officers need to comply with regulations, and we provide them visibility and reporting tools.”

Privacy Considerations

WitnessAI’s platform routes all data through its system before reaching AI models, raising potential privacy concerns. The company addresses this by isolating and encrypting customer data to prevent leaks.

“We’ve built a low-latency platform with regulatory separation, creating a unique, isolated design to protect enterprise AI activity,” says Caccia. “Each customer has a separate instance of our platform, encrypted with their keys. Their AI activity data is exclusive to them — we can’t access it.”
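
As a general illustration of the tenant-scoped encryption Caccia describes, the following Python sketch encrypts a customer's activity record with a key held only by that customer, using the cryptography library's Fernet primitive. It is a toy example of the concept, not WitnessAI's implementation.

# Illustrative only: tenant-scoped encryption of AI activity logs, where each
# customer's records are encrypted with a key that only that customer holds.
from cryptography.fernet import Fernet

# In practice the key would live in the customer's own key-management system;
# here it is generated locally purely for demonstration.
customer_key = Fernet.generate_key()
tenant_cipher = Fernet(customer_key)

activity_record = b'{"user": "alice", "action": "prompt", "model": "llama-3"}'
ciphertext = tenant_cipher.encrypt(activity_record)

# Only the holder of customer_key can recover the record.
assert tenant_cipher.decrypt(ciphertext) == activity_record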

Despite Caccia’s reassurances, the potential for workplace surveillance remains a contentious issue. Surveys indicate that employees generally dislike having their activities monitored, as it can negatively impact morale. A Forbes survey found that nearly a third of respondents might consider quitting if their employer monitored their online activities and communications.

However, interest in WitnessAI’s platform remains high, with 25 early corporate users in the proof-of-concept phase and general availability expected in Q3. WitnessAI has also raised $27.5 million from Ballistic Ventures and GV, Google’s corporate venture arm.

Future Plans and Market Position

The funding will help WitnessAI expand its team from 18 to 40 by year-end. This growth is essential to compete with other companies offering model compliance and governance solutions, including tech giants like AWS, Google, and Salesforce, and startups like CalypsoAI.

“We’ve planned for sustainability into 2026 even without sales, but we already have a robust sales pipeline,” Caccia says. “This is our initial funding round and public launch, but secure AI enablement is a new field, and our features are evolving with this emerging market.”
