Concerns have been raised about the safety of DeepSeek, a free Chinese AI chatbot that has been outperforming its American rivals.
According to cybersecurity experts, the app does not appear overtly harmful at first glance, but it still poses serious privacy risks: it is subject to Chinese regulations, and, as an artificial intelligence tool, it can gather and retain any information users provide.
All large language models, or LLMs (the kind of sophisticated AI chatbot popularized by OpenAI's ChatGPT), are built by first ingesting enormous volumes of data, and they work in part by collecting user input. In that respect, DeepSeek is no different from ChatGPT, though it is more efficient.
Chinese legislation requires all businesses to cooperate with and support Chinese intelligence operations, potentially exposing data held by Chinese companies to government surveillance. That is not the case in the United States, where authorities generally need a court order or warrant to access data held by American tech corporations.
There are, however, ways to reduce the amount of data you transfer to China while using DeepSeek. Creating an account, which is required to use the app or the chatbot on deepseek.com, calls for a Chinese phone number or email address, which most people outside of China do not have.
Users should be cautious about giving DeepSeek any private or sensitive information, Lukasz Olejnik, a researcher at King’s College London Institute for AI and an independent consultant, told NBC News.
Exercise caution when entering trade secrets, financial information, sensitive personal data or medical information: under China’s data rules, anything you input could be stored, examined or requested by officials, Olejnik added.
This means DeepSeek users should exercise extra caution if they have any reason to fear Chinese authorities, said Ron Deibert, director of the University of Toronto’s Citizen Lab.
Users whom mainland China views as a significant threat, such as journalists, members of targeted diaspora populations and human rights activists, should be especially aware of these concerns and refrain from entering any sensitive data into the system, Deibert said.
One way to cut down on the data you transmit to China is to use a different email address for DeepSeek than the one you use for other important services. That could make it harder for the app, or for Chinese intelligence services, to match your identity on other websites with what you give DeepSeek.
More tech-savvy users can download the DeepSeek AI model and run it themselves, asking it questions directly and bypassing the Chinese company that handles the requests. According to Olejnik, this not only keeps China from seeing anything you give the model, but it also means little or no filtering of subjects that are prohibited in Beijing.
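For readers curious what running the model locally looks like in practice, one common route (an illustration, not part of the original reporting) is the open-source Ollama runtime, which distributes quantized builds of DeepSeek-R1; the exact model tag and download size vary by variant.

```shell
# Install Ollama (macOS/Linux; Windows users download an installer
# from ollama.com instead).
curl -fsSL https://ollama.com/install.sh | sh

# Download a DeepSeek-R1 build and start an interactive chat.
# "deepseek-r1" is the default tag; smaller quantized variants
# exist for machines with limited memory.
ollama run deepseek-r1
```

After the one-time download, prompts are processed entirely on the local machine rather than on DeepSeek's servers, though larger model variants require substantial RAM or a capable GPU.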
DeepSeek’s privacy policy has also raised concerns: it states that the company gathers a large amount of personal data from users, including the type of device they are using and their keystroke patterns or rhythms. While that may strike some as intrusive, it is not unheard of, and it applies only to what a user types into the app, not what they type into other apps. Facebook and TikTok, for instance, have similar methods for monitoring users’ mouse movements and keystrokes.
While providing information to a Chinese LLM carries some hazards, Deibert warned that American LLMs pose risks as well: all AI platforms, including those headquartered in the U.S., are subject to similar dangers, he said.
Other U.S. digital businesses have recently been courting President Donald Trump and collect similarly sensitive data, Deibert said. Anyone who is even mildly critical of the administration, serves as a watchdog of it, or belongs to a vulnerable or at-risk community should proceed with extreme caution before using, or entering any data into, what are essentially black boxes. Remember, he noted, that user data is part of the raw material used to train those systems, just as it is for almost all social media platforms.