Google, creator of Bard, warns employees about the dangers of chatbots

midian182

Facepalm: In what should probably be a red flag for the rest of us, Google parent Alphabet is warning all its employees to be cautious when using AI chatbots, even its own Bard. The company has also told its engineers to avoid directly using code generated by these services.

A Reuters report citing four people familiar with the matter states that Alphabet has advised its workers not to enter confidential information into AI chatbots.

There have been warnings about not oversharing with generative AIs since ChatGPT rocketed into the public eye earlier this year. The NCSC, part of the UK's GCHQ intelligence agency, said that sensitive user queries, such as health questions or confidential company information, are visible to the provider and may be used to teach future versions of chatbots.

Samsung banned the use of ChatGPT in May after three incidents in which its semiconductor fab engineers entered sensitive data into the service. Amazon has barred employees from sharing code or confidential information with the chatbot, and Apple has banned it completely. Even ChatGPT's creator, OpenAI, advises users to be wary of what they type into the prompt.

In addition to human reviewers potentially reading sensitive data that users have entered, there's also the risk of this information being exposed in a data leak or hack. Back in March, OpenAI took ChatGPT's chat history feature temporarily offline after a bug in the service caused the titles of other users' conversations to appear in the user-history sidebar found on the left side of the webpage.

Despite the restrictions, a recent survey of 12,000 professionals found 43% said they use AI tools such as ChatGPT for work-related tasks, one-third of whom do so without telling their boss.

Alphabet also told its engineers to avoid the direct use of code that chatbots generate. When asked why, the company said Bard can make undesired code suggestions, but it still helps programmers. Google also said it aimed to be transparent about the limitations of its technology.

Bard hasn't had a smooth life so far. It started by generating the wrong answer in its first demo in February. A few months later, we heard that Google employees reportedly told the company not to launch the chatbot, calling it a "pathological liar," "cringe-worthy," and "worse than useless."

There was more bad news for Bard this week when it was revealed that Google won't be launching the chatbot in Europe yet due to privacy concerns.


 
As a junior developer, I consider ChatGPT and other AIs a savior, because the amount of insults you get from experienced developers on different websites is unbearable.

I am aware, of course, of all the data-leaking stories. But can we use anything online nowadays, or unfortunately even offline, without leaking data?
 
People are dumb enough to put their private and personal information online on social media for others to see. Does Google actually think people are smart enough to not put private information into a chatbot? HAHAHAHAHA!

People are f'ing stupid. You could literally download a virus to most people's devices and have it do a pop-up prompt that tells them if they click on "YES" it will install a virus. Most people will just click yes to clear the pop-up so they can continue with whatever they're doing. They don't read, they don't think and they can't comprehend things happening around them because they just want to get to their wanted results.
 
ChatGPT writes Python code well.

I mean, the human being paid to code should check it before making it live, but still: it's good code.
^THIS

This was going to be my comment. I have found ChatGPT to be amazingly good at Python and use it almost daily. However, I would never use it blindly, and I don't even know how a software developer could use a piece of code without thoroughly reviewing and testing it first. ChatGPT is an amazing coding assistant and is able to create 80% of the boilerplate code and interfaces for me. I still need to do a fair amount of online research to make sure I understand it, fix minor bugs, and test it. I don't know what kind of engineers work at Google that would actually have this issue.
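The review-and-test workflow the commenters describe can be sketched in a few lines. This is a minimal illustration, not code from the article: `slugify` is a hypothetical helper of the kind a chatbot might generate, and the asserts stand in for the edge-case testing you'd do before making it live.

```python
import re

# A hypothetical AI-generated helper (the name and behavior are
# illustrative, not taken from the article or any real chatbot output).
def slugify(title: str) -> str:
    """Convert a title to a lowercase, hyphen-separated slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

# Before shipping, exercise the code with edge cases rather than
# trusting the generated output blindly: punctuation, repeated
# whitespace, and empty input.
assert slugify("Hello, World!") == "hello-world"
assert slugify("  multiple   spaces ") == "multiple-spaces"
assert slugify("") == ""
```

In practice these checks would live in a proper test suite (e.g. pytest) rather than inline asserts, but the point stands: the generated code is a draft that the paid human verifies.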
 