Plugging humans into the AI playground

Finastra is leveraging generative artificial intelligence to enhance creativity and productivity by adopting Microsoft’s Bing Chat for its employees, says Adam Lieberman

Guest contributor


OpenAI’s ChatGPT has created a completely new artificial intelligence arena. This marks the first time a large language model (LLM) and its generative capabilities have become available for mass consumption.

The surge in popularity of ChatGPT compelled businesses worldwide to consider how it affects their workforce. However, with any new technology come new risks. Whether the concern is security and data, the blurred lines of intellectual property (IP), amplified bias or ‘hallucinations’, every company needs a meticulous risk mitigation strategy for each potential scenario.

At Finastra, we recognise the value of generative AI to enhance creativity and productivity when rolled out in a safe and secure way. We searched high and low for the right solutions to help us unlock our true potential. That’s why we chose to become an early adopter of Microsoft 365 Copilot and to roll out Bing Chat Enterprise for all our employees.

So, how did we do it and what are we using it for? For our customers, our security is their security. They rely on us to provide industry-leading solutions without exposing them to unnecessary risk. As a long-standing partner of Microsoft, with many of our solutions deployed on Microsoft Azure, we have always valued the robust security, flexibility and scalability it offers.

Microsoft’s Bing Chat Enterprise is an AI-powered chat tool. Built on models trained on data from the web, it also searches the internet for information in real time. Yet it operates as an internal tool: our user and business data remain within our own networks. Chat data is not saved, nor is it used to retrain any underlying models. We set our own parameters to ensure responsible usage and to protect sensitive information.

Every employee plays a role in safeguarding our security, which is why we also published our internal Enterprise Generative AI policy. This includes ensuring that we all validate information, do not share restricted data – such as personal, Finastra IP or customer data – and do not generate content that infringes on IP rights. We have also emphasised the importance of monitoring outputs for potential biases and taking corrective actions to mitigate them.

Our committees and working groups oversee and manage our governance frameworks, risks, compliance and data management. I personally lead our AI and machine learning (ML) team, known as GenAI Labs, which supports generative AI tools at Finastra and helps develop additional standards, procedures and guidelines alongside the risk and legal teams.

Bing Chat acts as our own digital assistant, providing us with vast amounts of knowledge at our fingertips and the headspace to focus on work that matters most. It enhances creativity, innovation and productivity. All Finastra employees have access to our online course to upskill on generative AI more broadly, to enhance both their career and personal life skills.

We have outlined use cases for the tool that my colleagues are already employing, such as drafting emails and letters, creating spreadsheet calculations and formulas, and producing first drafts for longer-form content. In fact, some of the content in this article you’re reading was generated, with some editing, by Bing Chat!

As Finastra is part of the Microsoft 365 Copilot Early Access Program, our employees will soon be able to take this one step further. Copilot combines the power of LLMs with our own data inside its applications. We will be able to ask Copilot to generate meeting synopses and action lists following Microsoft Teams calls, as well as product launch plans in Microsoft PowerPoint and intuitive graphs to inform financial decision-making.

For developers, we’ve rolled out Finastra’s Secure Zone, a data platform built for production-grade data engineering, data science and ML. Designed with usability and security in mind, it gives teams the ability not only to experiment and develop prototype AI solutions, but also to take both traditional and generative AI models into production with full monitoring capabilities.

We’re providing a secure, holistic environment that makes our developers’ day-to-day work more productive and enjoyable, with access to capabilities such as cookbook-style tutorials and a platform for managing ML workflows. The Azure OpenAI Service provides access to OpenAI’s language models, which users can employ for tasks like content generation, summarisation, semantic search, code translation and more. To ensure that our team has the latest and greatest, we’re also planning to roll out GitHub Copilot in due course.
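As a flavour of what this looks like in practice, the snippet below is a minimal sketch of calling the Azure OpenAI Service for summarisation via the openai Python SDK. The endpoint, API version, deployment name and environment variables are illustrative assumptions rather than Finastra’s actual configuration.

```python
import os

from openai import AzureOpenAI  # openai Python SDK, v1.x-style Azure client

# Placeholder resource details: set these to your own Azure OpenAI endpoint and key.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

document = "Finastra is rolling out generative AI tools for employees and developers..."

# Ask the deployed chat model to produce a short summary of the document.
response = client.chat.completions.create(
    model="gpt-35-turbo",  # name of your Azure OpenAI deployment (assumed here)
    messages=[
        {"role": "system", "content": "You summarise documents in two sentences."},
        {"role": "user", "content": document},
    ],
)

print(response.choices[0].message.content)
```

The same pattern extends to the other tasks mentioned above, such as semantic search or code translation, largely by changing the prompt and the deployed model.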

We owe it to our employees to upskill and prepare them for the future of work. We believe in the importance of empowering our people to effectively, and securely, utilise innovative tools to create jobs that are meaningful, creative and inspiring, and that ultimately create better outcomes for our customers.

Adam Lieberman is head of AI and ML at Finastra

This article was originally published in the Autumn 2023 issue of Technology Record.