Neural networks are continuously trained on vast amounts of data from the Internet and from user content, so ChatGPT retains the details of requests and can build its responses on the information it has received. For example, to solve a work task, an employee might ask the chatbot to write part of some code or a corporate document. In doing so, the employee may hand the bot confidential information about the company, assuming it will go no further, noted Petr Kutsenko, head of the R-Vision Endpoint component at R-Vision.
"But the language model, having answered the task, will record the wording of the question in the cloud. As a result, this data is saved, and due to the imperfection of the algorithms, the content writing service information can be obtained by third parties," he noted to RSpectr.
AI either parses information from open sources or works with whatever its administrator has given it, Alexey Parfentyev, head of the analytics department at SearchInform, noted in a conversation with RSpectr. According to him,
using neural networks is not espionage, but it can still violate the law: the data may be personal, and it may be processed and stored improperly
For example, consent to processing may not have been obtained, and security and storage requirements may not be observed, he added.
Today, users know practically nothing about how AI systems are built or how they interact with the outside world, Mikhail Emelyannikov, managing partner of the consulting agency Emelyannikov, Popova and Partners, commented to RSpectr. He is confident that
sensitive information, including personal data, must not be shared with ChatGPT and similar AI-powered services
Mikhail Emelyannikov, Emelyannikov, Popova and Partners:
– Just the other day, a report from the Massachusetts Institute of Technology and the University of California said that self-developing AI models can create AI subsystems without any human assistance or involvement. What will these subsystems process and how, where will this happen, and who will receive the results?
Until there are clear answers, the use of AI for processing confidential data must be limited, he stressed.
TAKE ACTION
ChatGPT is a cloud technology, so can it be trusted with commercial information and personal data? As with any cloud service, anything that is not within your organization's IT footprint cannot be considered yours, Nikita Nazarov, technical director of HFLabs, emphasized in a conversation with RSpectr.