Understanding generative AI: Credera at the AWS London Summit 2023
Debbie Griffin
Having recently secured the AWS Premier Tier Partner accreditation, Credera is now working even more closely with AWS to offer our clients the best advantages available from the AWS cloud, including improved time-to-market for products and additional support in taking products into production.
When we attended the AWS London Summit earlier this month, we learned about AWS’s approach to generative AI – something that is becoming of increasing importance for organisations to achieve competitive advantage in the marketplace.
In this blog, we cover what we learned about the Amazon Bedrock service at this year’s event and why this is an important tool to add to our clients’ AWS service portfolio.
What is generative AI?
Generative AI can create new content and ideas including conversations, stories, images, music, and videos. Like all modern AI, it is powered by machine learning (ML) models - in this case, very large models that are pre-trained on vast amounts of unstructured, internet-scale data, commonly referred to as foundation models (FMs).
Recent advancements in ML technology have led to the rise of FMs containing hundreds of billions of parameters, which makes them very complex and costly to build - so what makes them so popular?
Generative AI models are the most recent evolution of this technology and leverage the latest advancements in ML. Because an FM has been pre-trained across billions of parameters, a single model can perform many different tasks: it can be instructed in different ways and asked to carry out an assortment of tasks, all served by the same single FM.
Listen next: Technology Tangents | Unlocking innovation: The generative AI revolution in business
Why should you care about it?
Generative AI builds on techniques that have been around for many years, but it is only recently that it has brought AI into the mainstream - most visibly in the guise of systems such as ChatGPT. Aside from the novel uses individual consumers are making of it - often to lighten the load of a task you don't really want to do - how can this technology genuinely improve our ways of working? How can you leverage generative AI to make a difference to your organisation?
Here are some key points to consider:
- You can customise FMs using your own data for domain-specific operations.
- You can enhance your product offerings for the marketplace using just a small amount of your data.
- You can fine-tune a model quickly.
- You need far less effort to generate and create your product.
- You need less time and money in terms of compute.
How do generative AI Models (FMs) differ from traditional ML models?
Some of the key differences between generative AI models and more traditional ML models are outlined below:
- With generative AI models, you no longer need to label every item of data - a task that can be very time consuming and could make you less competitive in terms of time-to-market for new products.
- You can now use the same machine learning model - a foundation model - for many, if not all, of the AI tasks you want to perform. This simplicity also feeds through to the code, where often all you need to change is the name of the model you wish to use.
How can you quickly start taking advantage of generative AI Models?
Most of the time, we don't want to manage the model or the infrastructure ourselves. The great news is that AWS are set up to do all of the heavy lifting for you: creating the working environment, deploying the model, and handling the scaling of the model up and down. This means all you have to do is issue an API call stating what you want the model to do - for example, to generate some text based on your instructions. To help you get started, AWS have recently introduced Amazon Bedrock - Amazon's latest offering, which makes FMs from leading AI startups and Amazon available via an API.
Amazon Bedrock is the easiest way to build and scale generative AI applications with FMs. Bedrock helps you find the right model for your use case - some can be fine-tuned to make them more specific to your business and can be easily integrated into your existing applications.
Some of the key benefits of Amazon Bedrock include:
- It accelerates the development of generative AI applications using FMs through an API, without managing infrastructure and scaling.
- It allows you to choose FMs from AI21 Labs, Anthropic, Stability AI, and Amazon (the Titan family) to find the right FM for your use case.
- You have the ability to privately customise FMs using your organisation’s data.
- It allows you to enhance your data protection using comprehensive AWS security capabilities.
- Responsible AI practices are followed by the supported model providers, and the Amazon Titan models are built in line with AI best practices.
The aim of Amazon Bedrock is to quickly integrate and deploy FMs into your applications and workloads that are running on AWS using familiar controls and integrations. It provides users with the depth and breadth of other AWS services such as Amazon SageMaker and Amazon S3.
AWS Generative AI Incubator
Amazon also run the AWS Generative AI Incubator Programme, in which applied scientists come on-site and run discovery workshops to help you identify your top use cases for generative AI, and to get you there much faster than you would on your own. These scientists then help you build proofs of concept (PoCs) for those use cases, supporting you until you can decide whether the products should go into production and whether they will add the value you want.
Generative AI models Q&As
1. Where is the generative AI model stored?
The models are stored in the model provider's account, but none of your data is stored outside your own AWS account.
2. Where am I sending my data to?
Your data never leaves your own AWS customer account.
3. Who can see my data?
This is controlled by you in the same way you grant access and permissions to any of your AWS resources. In fact, you can build a generative AI model in your AWS account which will follow your own organisation’s data encryption and security policies.
4. Will my data be used to train other ML models?
Amazon have architected their solution such that, they state:
- They will not use your inputs or data for anything other than what you are asking for it to be used for. This is not configurable. It is not an option.
- Customer data is not used to improve the Amazon Titan models for other customers and is not shared with other FM providers.
- Customer data (prompts, responses, fine-tuned models) is isolated per customer and remains in the region where it was created.
5. Is my data secure?
Amazon state that:
- Customer data is always encrypted in transit with a minimum of TLS 1.2, and encrypted at rest with AES-256 using AWS Key Management Service (KMS) managed data encryption keys.
- Bedrock integrates with AWS Identity and Access Management (IAM) to manage inference access, allow or deny access to specific models, control AWS Management Console access, and more.
- You can use AWS CloudTrail to monitor all API activity and troubleshoot issues as you integrate with your applications.
- Fine-tuned (customised) models are encrypted and stored using a customer-managed AWS KMS key. Only you have access to your customised models.
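To illustrate the IAM integration described above, a policy along the following lines could grant a role permission to invoke one specific model while all others remain denied by default. The region and model ID are illustrative assumptions; the ARN format follows the standard Bedrock foundation-model pattern.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["bedrock:InvokeModel"],
      "Resource": "arn:aws:bedrock:eu-west-2::foundation-model/amazon.titan-text-express-v1"
    }
  ]
}
```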
6. Will the results from the model be full of offensive content?
AWS only work with responsible model providers, whose models filter inappropriate content both in the inputs they accept and in the outputs they generate.
In a nutshell
Generative AI is a hot topic and one of the latest tools to allow organisations to gain additional insights from data as part of a broader data strategy. Whilst it is certainly powerful and carries lots of potential, it is important to recognise that it is not a ‘silver bullet’.
In his recent CIO article, "Generative AI won't automate your way to business model innovation", Brian Solis posed the question: "If work and technology are to serve the purpose of making businesses better in this digital renaissance, the question is, better for who? And what does better look like? What makes it more meaningful?" This lens would be a very useful one to take when deciding whether generative AI is the answer, or part of the answer, to your organisation's digital and business transformation.
How we can help
As a Premier Consulting Partner, we strive to make an extraordinary impact for our clients by solving their business challenges whilst harnessing the power of AWS services to simplify, modernise, and scale solutions at pace. We have first-class expertise in AI/ML, MLOps, Data Analytics, and DevOps, and using our extensive industry knowledge, we aim to help our customers in Financial Services, Public Sector, and Energy, to continue to gain a competitive advantage.