
Security Safe Space: ChatGPT-powered productivity apps are rising in popularity, but be cautious about sharing personal information

  • Nishadil
  • January 08, 2024
  • 2 minute read

Productivity apps touting the promise of “artificial intelligence” are becoming increasingly common. From prioritizing tasks to keeping up with a fitness routine, there’s seemingly a ChatGPT-powered productivity app for just about any New Year’s resolution. Beneath the surface, however, it feels like the beginning of a cautionary tale.

In March 2023, OpenAI released the ChatGPT API, with immediate adoption from heavy hitters like Snapchat for its My AI chatbot, but also Instacart, Quizlet, and Shopify for its consumer app. However, thanks to the API’s rather loose terms of service – which can best be summed up as “if you’ve got a pulse, you’re clear” – it wasn’t long before developers flooded the App Store with productivity-related apps built on ChatGPT.
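
To see just how low that barrier is, here is a minimal sketch – written against the current openai Python client, with a hypothetical prompt and the API key assumed to be in the OPENAI_API_KEY environment variable – of roughly what an entire “AI task prioritizer” feature can amount to:

```python
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# One API call is essentially the whole "AI" feature: the user's text
# is sent verbatim to OpenAI's servers and a reply comes back.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a task-prioritization assistant."},
        {"role": "user", "content": "Prioritize: file taxes, gym, buy groceries."},
    ],
)
print(response.choices[0].message.content)
```

Everything in those messages – including whatever the user typed – leaves the device and lands on a third party’s servers.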

A recent investigation into the privacy policies of popular personal-productivity apps by security researchers at PIA (Private Internet Access) found “troubling” examples of poor transparency. One standout was a popular AI chat assistant that uses the ChatGPT API, combined with its own stored database, to tailor answers to the user’s prompt.
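
As a rough illustration of that pattern (the database and table names here are hypothetical, not taken from the app in question), tailoring answers from an “existing database” typically means folding stored user data into every request before it is sent to the API:

```python
import sqlite3

# Hypothetical sketch: the app reads its locally stored user database and
# prepends it to each prompt, so past data travels with every new request.
def build_messages(user_message: str, db_path: str = "user_data.db") -> list[dict]:
    conn = sqlite3.connect(db_path)
    rows = conn.execute("SELECT fact FROM user_profile").fetchall()
    conn.close()
    context = "\n".join(fact for (fact,) in rows)  # names, habits, past chats...
    return [
        {"role": "system", "content": f"Known user context:\n{context}"},
        {"role": "user", "content": user_message},
    ]

# Whatever build_messages() returns - stored context included - is then
# transmitted to the API provider alongside the user's new prompt.
```

The upshot: even data the user shared weeks ago can ride along with every new prompt.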

Despite the app’s App Store page claiming it only uses messages and device IDs to improve app functionality and manage accounts, “Its true data practices are hidden in its privacy policy, which states it collects your name and email, usage stats, and device information,” states PIA. It’s not uncommon for apps to collect this type of user data to sell to third parties or to use in building detailed profiles for personalized ads.

And this is just a drop in the ocean compared to the flood of ChatGPT-powered food, health, and productivity apps available on the App Store right now. AI coding, personal fitness advice, translation – without combing through the data policies and practices of each one, is there a broader takeaway here? Not long after ChatGPT’s explosion in popularity in January 2023, regulators and lawmakers expressed grave concerns over the use of personal information in its training data.

Italy even temporarily banned the service last year until better privacy notices were implemented. There are two ways ChatGPT gets hold of personal information. The first is through the bulk data used to train the large language model (LLM): vast amounts of permissionless works like articles, books, blog posts, and other text scraped from all over the Internet.

The second, and most notable in this case, is through ChatGPT itself or one of the many apps using its API. Because the chatbot is designed to converse, it can create a false sense of security, leading users to share sensitive information – names, addresses, health data, financial information, and other personal details they usually wouldn’t.
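
One practical habit – shown here as a rough, deliberately incomplete sketch, not a guarantee of anonymity – is to scrub obvious identifiers from text before handing it to any chatbot-backed app:

```python
import re

# Illustrative patterns only: real PII detection is much harder than this.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b(?:\+?\d{1,3}[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scrub(text: str) -> str:
    """Replace anything matching a known identifier pattern with a placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} removed]", text)
    return text

print(scrub("Email me at jane@example.com or call 555-123-4567."))
# -> Email me at [email removed] or call [phone removed].
```

It’s only a first line of defense; the safest personal data is the data that never leaves your device.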

Any information shared, private or not, is stored by OpenAI and, as far as we know, never deleted. While the company claims stored data is stripped of personal identifiers – in other words, anonymized – I strongly caution against sharing anything private. After all, signing up for ChatGPT requires a phone number, ostensibly to help the platform prevent spam bots.

This alone raises questions about how anonymous users really are. Should you avoid apps with ChatGPT integration? Not exactly. But it’s important to exercise caution with anything you enter personal or sensitive information into. A false sense of security can lead to unknowingly oversharing, and between cybercriminals and sketchy data policies, what you share with these apps could end up anywhere.