
ChatGPT, the AI-powered chatbot that can answer complex questions in seconds, is the subject of a new warning from Amazon, which has cautioned employees against sharing confidential data with it. Business Insider has reported that Amazon employees are using ChatGPT for research as well as to solve day-to-day problems, based on messages shared in an internal Slack group.

ChatGPT has had the tech industry sweating ever since it took off late last year, and now even Amazon is feeling the heat. According to internal company communications viewed by Insider, an Amazon lawyer has reportedly urged employees not to share code with the artificial intelligence chatbot.

Insider reported earlier this week that the lawyer specifically asked employees not to share “any Amazon confidential information (including Amazon code you are working on)” with ChatGPT, a claim backed up by screenshots of Slack messages reviewed by the publication. The guidance came after the company received reports that ChatGPT’s responses had mimicked internal Amazon data.

“This is important because your inputs may be used as training data for a further iteration of ChatGPT, and we would not wish its output to contain or resemble our confidential information (and I have already seen instances where its output is closely related to existing material),” the lawyer added, as quoted by Insider.

Amazon’s concern that ChatGPT may have absorbed its data is not far-fetched: in a related report, the chatbot was alleged to have correctly answered many of the interview questions for a software development job at the company. A review of Slack channel transcripts, also conducted by Insider, shows the AI was able to answer software coding questions correctly and even improve some of Amazon’s code.

“I was honestly impressed!” an employee reportedly wrote on Slack. “I am both scared and excited to see what impact ChatGPT will have on the way in which we conduct coding interviews.” ChatGPT is still just a novelty at the moment, but many questions about how it could affect our daily lives have surfaced in recent weeks.

ChatGPT clearly has the potential to play a powerful role in a variety of fields, including education: it passed the final exam in an MBA-level course at Wharton despite some struggles with basic arithmetic. OpenAI’s CEO believes school administrators need to get over their fears of the technology, even as some school systems, such as the New York City Department of Education, have banned it over cheating concerns.

According to the report, Amazon employees have been impressed by the chatbot’s capabilities. After testing it, staff in the Amazon Web Services cloud unit said ChatGPT did a good job of answering customer support questions and creating “very strong” training documents for customers. Engineers reportedly used the chatbot to review code as well, with positive results. ChatGPT, however, is said to have had difficulty creating a “rap battle of epic proportions.”

That does not mean ChatGPT cannot improve, and its developer, OpenAI, may add more capabilities in the coming months. Google is also reportedly working on a ChatGPT rival, as many believe the AI-powered chatbot poses a big threat to its search engine. A key difference between the two is that ChatGPT offers a single answer drawn from sources available online, rather than a list of links.

Amazon’s internal Slack channel is full of employee questions about how to use ChatGPT. Some employees asked whether there were official guidelines for using ChatGPT on work devices. Others wondered if they were allowed to use AI tools for work at all. One employee urged Amazon’s cloud computing division, AWS, to clarify its stance on using “generative AI (AIGC) tools.”

An Amazon corporate lawyer soon joined the discussion. According to a screenshot of the internal communication in the Slack channel, the lawyer warned employees not to provide ChatGPT with “any Amazon confidential information,” including Amazon code being written. He also advised employees to follow the company’s existing non-disclosure policy, because some of ChatGPT’s responses looked very similar to internal Amazon material.

These exchanges suggest that the sudden emergence of ChatGPT has raised many new ethical questions. ChatGPT is a conversational AI tool that responds to queries with fluent, direct answers. Its rapid proliferation has the potential to disrupt several industries, including media, academia, and healthcare, prompting efforts to find new use cases for chatbots and to assess their possible impact.


How employees share confidential information with ChatGPT, and what its developer, OpenAI, does with it, could become a thorny issue. That is especially important for Amazon, since archrival Microsoft has invested heavily in OpenAI, including a new funding round this week that reportedly totals $10 billion.

Emily Bender, who teaches computational linguistics at the University of Washington, said: “OpenAI is far from transparent about how it uses data, but if the data is used for training, I expect companies to think: After several months of widespread use of ChatGPT, is it possible to obtain confidential information of a private company through carefully crafted prompts?”

Amazon has many internal safeguards for employees using ChatGPT. For example, screenshots of the exchange show that when employees use work devices to access the ChatGPT website, a warning message pops up saying they are about to access a third-party service that “may not be approved for use by Amazon Security.”

Employees participating in the Slack channel chat said they could bypass the message simply by clicking on the “Acknowledge” tab.

Staff speculated that the warning popup was meant to prevent employees from pasting confidential information into ChatGPT, especially since they had not seen a company policy on internal use.