The time to address secret sprawl is now: Here’s how

Slack has come under siege for using customer data to train its global AI models and generative AI add-on. Sure, requiring users to manually opt out via email seems sneaky (isn’t avoiding email the whole point of Slack?), but the messaging app doesn’t bear all the responsibility here. The most popular workplace apps have all integrated AI into their products, including Slack AI, Jira’s AI-Powered Virtual Agent, and Gemini for Google Workspace. Anyone using technology today — especially for work — should assume their data will be used to train AI. That’s why it’s up to individuals and companies to avoid sharing sensitive data with third-party apps. Anything less is naive and risky.
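One practical way to act on that advice is to scan outbound messages for secret-like strings before they ever reach a third-party app. The sketch below is illustrative only: it checks a few well-known credential formats with regular expressions. The pattern set and function names are assumptions for this example; real data loss prevention tools rely on far broader rulesets and ML-based detection.

```python
import re

# A minimal pre-send scan for a few well-known secret formats.
# These patterns are illustrative assumptions, not a production ruleset.
SECRET_PATTERNS = {
    "AWS access key ID": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "Slack token": re.compile(r"\bxox[baprs]-[0-9A-Za-z-]{10,}\b"),
    "Private key header": re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),
}

def find_secrets(text: str) -> list[tuple[str, str]]:
    """Return (label, matched string) pairs for secret-like content."""
    hits = []
    for label, pattern in SECRET_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((label, match))
    return hits

# Example: block a message instead of posting it if anything matches.
message = "deploy with AKIAABCDEFGHIJKLMNOP and bot token xoxb-1234567890-abc"
for label, _ in find_secrets(message):
    print(f"Blocked: {label} detected")
```

A check like this belongs at the boundary — a chat-bot gateway, a pre-commit hook, or an API proxy — so sensitive data is caught before it leaves your control, not after.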

Rohan Sathe

Co-founder and CTO of Nightfall AI.

Trust no one

There’s a valid argument floating around the internet that Slack’s opt-out policy sets a dangerous precedent for other SaaS apps to automatically opt customers in to sharing data with AI models and LLMs. Regulatory bodies will likely examine this, especially for companies operating in regions covered by the General Data Protection Regulation (but not the California Consumer Privacy Act, which allows businesses to process personal data without permission until a user opts out). Until then, anyone using AI — which IBM estimates is more than 40% of enterprises — should assume shared information will be used to train models.
