Guidance on ChatGPT (or other AI language models) For Regulated Firms - Waystone


      Over the last few months many clients have been asking for guidance as it relates to ChatGPT and other natural language AI models.

      First, ChatGPT is a chatbot developed by OpenAI that uses natural language processing and machine learning to deliver a conversational interface. ChatGPT can interpret written prompts, draw on its training data to formulate answers, and respond in fluent natural language in real time, often producing text that reads as though it were written by a human.

      Why regulated firms should avoid using ChatGPT

      At this time, it is not recommended that regulated firms use ChatGPT or similar tools in a professional capacity. While some security measures are in place, they are not sufficient to guarantee the privacy and security of your data. Furthermore, ChatGPT’s terms of use grant OpenAI broad access to the information that you share via the service. While you own the content that you create, OpenAI, the company behind ChatGPT, has full access to the data you submit through the tool and may retain that data indefinitely, even after the user has deleted it.

      Additionally, ChatGPT offers no guarantees regarding the accuracy or quality of its output. Although the models continue to improve, accuracy is likely to remain an ongoing issue with AI-powered language models such as this: no training dataset can cover every possible situation that arises when communicating with another human being, whether English is their first language or their second.

      Recommended use cases for ChatGPT

      Your staff may use ChatGPT at home for any of the following:

      • general interest in AI language models
      • experimentation with how AI language models work
      • education about how AI language models work.

      When ChatGPT is used outside the firm, care should be taken not to expose details about the company. This includes:

      • confidential information (e.g. financials)
      • sensitive topics (e.g. board meetings)
      • company secrets (e.g. employee records)
      • company policies and procedures (e.g. HR practices).

      It is also important that staff do not discuss company resources or assets such as equipment or real estate, since those belong to the firm and not to the employee.

      Information security policies and technical controls

      The firm’s current Information Security Policy should have provisions against Shadow IT, which will cover ChatGPT. Shadow IT is the use of unauthorized applications, hardware, or software by employees. You may wish to remind staff that the use of unapproved applications is a violation of policy and that this includes the use of ChatGPT. If you do not have provisions against the use of Shadow IT, now is the time to include it in your policy.

      In addition to a policy, your firm may implement technical controls to block access to ChatGPT. IT can block the site by forcing web browser settings through a policy and by using a content filtering system (CFS). A CFS is software or hardware that filters websites based on categories, keyword lists or URLs.
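      To illustrate how a CFS evaluates requests, the sketch below shows minimal URL filtering logic in Python. The domain and keyword lists are illustrative assumptions for this example, not a vendor's actual block categories; a production CFS applies far richer rule sets.

      ```python
      # Minimal sketch of URL-based content filtering, as a CFS might apply it.
      # The blocklist entries below are illustrative assumptions, not a vendor list.
      from urllib.parse import urlparse

      BLOCKED_DOMAINS = {"chat.openai.com", "chatgpt.com"}  # hypothetical domain rules
      BLOCKED_KEYWORDS = {"chatgpt"}                        # hypothetical keyword rules

      def is_blocked(url: str) -> bool:
          """Return True if the URL matches a blocked domain (or subdomain) or keyword."""
          host = urlparse(url).hostname or ""
          if host in BLOCKED_DOMAINS or any(host.endswith("." + d) for d in BLOCKED_DOMAINS):
              return True
          return any(kw in url.lower() for kw in BLOCKED_KEYWORDS)

      print(is_blocked("https://chat.openai.com/"))  # True
      print(is_blocked("https://example.com/"))      # False
      ```

      In practice the same effect is achieved by policy, not code: IT adds the relevant domains to the CFS blocklist and pushes the restriction to managed browsers.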

      The usage of ChatGPT (or other AI language models) is an interesting topic and it’s clear that the use of chatbots and conversational interfaces is growing. As the technology continues to mature, safeguards will be put in place to ensure that necessary security and privacy considerations are implemented.

      To learn more, get in touch with our Cyber & Data Protection team today. 
