AWS re:Invent 2022: The latest updates, news, and more from the final day
Welcome to our live coverage of AWS re:Invent 2022, as the cloud computing giant takes over Las Vegas to show off and celebrate its successes and reveal a whole host of new releases, upgrades and more.
It’s been a bumper week so far, with keynotes from AWS CEO Adam Selipsky and Swami Sivasubramanian, Vice President of AWS Data and Machine Learning covering literally the entire globe, and out into space too.
The final day of the conference starts with a keynote from Dr. Werner Vogels, Vice President and CTO, AWS, which is always a highlight – so stay tuned for all the latest AWS re:Invent 2022 news and more with our live blog!
Good morning and welcome to day 1 of AWS re:Invent 2022!
We’re up bright and early this morning (thanks jet lag) and full of excitement for Adam Selipsky’s keynote today, where we’ll no doubt see and hear about all the company’s latest news and releases.
We’ve still got a few hours to go, so if you’re already excited, why not check out our live blog from last year’s event to get in the mood?
And we’re in! It’s a much smaller keynote hall than previous years, so the crowds are intense.
Speaking of intense, in typical US keynote session style, we’re being “treated” to some ear-splitting rock music at 8am…Linkin Park’s “Numb”, anyone?
Up this morning (when the rock finishes) is AWS CEO Adam Selipsky.
Selipsky took over the top job from Andy Jassy (now Amazon CEO) in 2021, having previously been CEO at Tableau Software.
Even for Vegas, this is loud….Mr Brightside anyone? Apparently it’s “the Vegas anthem”…
We’re 10 minutes out from the keynote!
And after a rousing rendition of “Don’t Stop Believing” to round us out, it’s time for the keynote!
Adam Selipsky up next…
Taking to the stage to the sounds of “Sweet Child O’ Mine”, Adam Selipsky is here!
“There’s no challenge too big that we can’t solve,” he notes, highlighting the good work done by customers such as BMW and Riot Games.
“Things that people said couldn’t be done, like bringing capital markets to the cloud – are getting done,” Selipsky notes.
But a huge number of start-ups are also choosing AWS.
Sustainability is also a big problem, Selipsky notes, calling it “the issue of our generation”.
He reveals that AWS is now the largest corporate purchaser of renewable energy, and aims to be 100% renewable by 2025 – and water positive by 2030.
AWS’ size means it can see challenges of all kinds affecting the world, with inflation, supply chain disruption, energy prices, and war all big challenges.
However, the cloud can prove a way for businesses to save money.
“If you’re looking to tighten your belt, the cloud is the place to do it,” Selipsky says.
You still need to innovate, Selipsky notes – and the cloud helps you do that, allowing firms to be flexible with fewer resources.
“You want to be ready for anything,” he notes.
Our first project showcase is taking us to space.
Selipsky harks back to Galileo’s advances in viewing the stars, all the way up to the rocket age and the new age of infra-red imaging to reveal much more in our images of space.
AWS has been helping its users gain knowledge from a suitably vast realm – the realm of data.
“It’s almost amazing to think about how much data there is,” he says, looking forward to the “data explosion” coming over the next few years.
Samsung, Expedia and Pinterest are all among those firms generating huge amounts of data every day.
“Just as the vastness of space means you can’t explore it with just one technology, the same can be said for data.”
You need a number of the right tools, all integrated together, with governance and security, plus the ability to visualize data in a way that delivers valuable insights for your business.
We’re moving on to look at Amazon Aurora, the company’s relational database service, which has apparently been a huge success ever since its launch.
This is backed up by AWS’ realm of analytics tools to help get all the insights you need.
Our first product announcement – a preview of Amazon OpenSearch Serverless for operational analytics – meaning AWS now offers serverless options across all its analytics services, and “no one else can say that”, Selipsky notes.
Now on to machine learning and AI.
Amazon SageMaker has proved incredibly popular, Selipsky notes, with tens of thousands of customers across a wide range of industries able to train models with billions of parameters.
There may be more news to come on this tomorrow though, when Swami Sivasubramanian, AWS VP, Database, Analytics and Machine Learning, does his keynote.
Our first customer highlight comes from energy giant Engie, which tells us how it is using AWS to accelerate the transition to a carbon-neutral economy.
This includes using predictive analytics, IoT and more to spot potential breakdowns, and utilizing cloud platforms to manage decentralized energy.
AWS helps Engie centralize and manage its data, storing more than 1 petabyte to be used across a thousand projects, all built on Amazon S3.
Selipsky is back now, and wants to talk about how AWS can always provide the right tools for your business to explore its data.
However this also raises the issue of integration – how do you get the most out of all these tools, all working together?
Combining data from different sources and different tools brings up a number of challenges, he notes, with complexity and usability major issues.
Luckily, AWS has been building integrations across many of its services…
“But what if we could do more?” he asks.
Selipsky highlights how moving transactional data into analytics systems is often a major pain point for businesses, especially as more and more data is generated every day.
Luckily there’s a new integration between Amazon Aurora and Amazon Redshift, giving near real-time analytics and ML on transactional data. Data appears within seconds, and updates automatically and continuously, giving you all the latest data when you need it.
“Now you really have the best of both worlds,” Selipsky notes.
Apache Spark is another vital tool, especially across SageMaker, EMR and AWS Glue…but not on Redshift.
Until now that is – there’s a new Amazon Redshift integration for Apache Spark, making it “incredibly easy” to run Spark queries on Redshift data – and there’s no need to move any data.
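To give a flavour of what that might look like in practice, here’s a minimal PySpark sketch of reading a Redshift table from a Spark job. The connector format string follows the community spark-redshift connector, and the cluster endpoint, table, temp bucket and IAM role are illustrative placeholders rather than details confirmed in the keynote.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("redshift-spark-demo").getOrCreate()

# Read a Redshift table into a Spark DataFrame via a Redshift connector.
# Format name and option values are illustrative assumptions.
sales = (
    spark.read.format("io.github.spark_redshift_community.spark.redshift")
    .option("url", "jdbc:redshift://my-cluster.example.us-east-1.redshift.amazonaws.com:5439/dev")
    .option("dbtable", "public.sales")
    .option("tempdir", "s3://my-temp-bucket/redshift-spark/")
    .option("aws_iam_role", "arn:aws:iam::123456789012:role/MyRedshiftRole")
    .load()
)

# Run an ordinary Spark aggregation over the Redshift data.
sales.groupBy("region").sum("revenue").show()
```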
Now it’s on to governance – meaning the right people get the right access to the right data, with the right controls in place.
Finding the right balance between control and access is crucial, but it’s different for every organization, Selipsky says.
You need to make sure the two are balanced – establishing the right governance gives your workers confidence and encourages innovation.
To solve these problems, there’s a new AWS service: Amazon DataZone, a data management service to catalog, discover, share and govern data.
This will bring people and data closer together, with full integration with key AWS tools for data analysis such as Redshift.
“There’s really nothing else like it,” Selipsky notes.
Now it’s on to more on those insights, specifically Amazon QuickSight, which has had a vast expansion of updates and upgrades recently.
This includes new operational paginated reports, which give businesses an easy-to-digest overview of their data analytics.
Amazon QuickSight Q is a way to ask questions and get answers about your business data – but what if it were able to answer questions about the future?
ML-powered forecasting with Q will do just that, allowing Q to predict the future using its machine learning capabilities. It can even answer “why?” questions to spot exactly why issues or problems might have emerged, running analysis on past data and offering suggestions.
From space, we’re now venturing into the oceans, down into the depths of the sea.
Selipsky highlights the likes of the bathysphere and sonar as key tools in helping us explore and analyze the oceans, giving us the “confidence to explore”.
With the right protections and tools, we can do more, he notes…just like with the modern world of data!
AWS’ security is a key selling point for many customers, Selipsky says, with the company’s global secure infrastructure network protecting hospitals, banks, governments and more.
AWS lets you be “secure and agile” – there’s no need to trade off one for the other, he notes, giving your business the confidence to build and explore.
This includes Amazon GuardDuty, which lets users detect and respond to security threats at scale.
The company revealed EKS protection for GuardDuty earlier this year to protect containers from the outside – but now, to protect what’s inside, there’s a new capability that adds container runtime threat detection.
AWS was a founding member of the recently-launched Open Cybersecurity Schema Framework, but wants to make it even easier for businesses to detect and protect against threats.
Namely, the new Amazon Security Lake service. This will make it easy for security teams to automatically collect, combine, and analyze security data at petabyte scale.
This could be a huge step forward for AWS security, as users will also be able to run queries using Amazon Athena, as well as integrate up to 50 partner analytics systems, in order to drill down precisely into security threats.
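As a rough illustration of that workflow, here’s a minimal boto3 sketch that submits an Athena query over Security Lake data. The database, table and results bucket names are hypothetical placeholders, not names confirmed in the announcement.

```python
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Submit a query against a (hypothetical) Security Lake table of normalized logs.
response = athena.start_query_execution(
    QueryString="SELECT * FROM amazon_security_lake_db.vpc_flow_logs LIMIT 100",
    QueryExecutionContext={"Database": "amazon_security_lake_db"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results-bucket/security-lake/"},
)

print("Query execution ID:", response["QueryExecutionId"])
```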
Next, we’re looking into the weather – namely, in terms of the extremes (in any direction) of what can affect your business and its data, and having the right tools to do so.
Selipsky uses the story of Amundsen vs Scott in their race to the South Pole in order to show how extreme conditions can make even small decisions have major effects.
Namely, “good enough simply isn’t good enough” when it comes to facing extreme challenges – and building on AWS.
Selipsky highlights how F1 teams are using AWS to reduce data-heavy simulation times by 70%, and Epic Games uses AWS to keep 100 million players supported.
There are over 600 instance types for virtually every workload, giving users the broadest and deepest compute choice.
These are powered by AWS hardware, including AWS Nitro and the Graviton processors, now into their third generation and boasting a huge number of high-profile customers.
Selipsky announces new C7gn instances for EC2, powered by Graviton3, giving huge advantages in network-intense workloads such as analytics.
Now on to machine learning workloads, a key area for AWS.
With models continuing to grow in size, scale and complexity, the need for powerful infrastructure to train and support these models is vital.
The company is now following up its Inf1 instances, which are ideal for smaller models, with new Inf2 instances for EC2.
This should offer 4x higher throughput, and 1/10th the latency of Inf1 instances, providing a major step forward.
HPC is up next, with HPC workloads another growing area of importance for AWS.
“The scale of the cloud is redefining HPC,” Selipsky notes, adding that customers need infrastructure tailored to their exact performance needs at scale, but also tools that are easy to use.
AWS offers a number of suitable tools, but is now looking to go a step further, with the launch of new Hpc7g instances powered by Graviton3E and EFA.
But that’s not all, as data and memory-intensive use cases get new Hpc6id instances designed for their specific needs.
Our next customer highlight comes from Siemens, which tells us how it is working with AWS to help move its industrial software into the cloud, as well as helping scale up good ideas from start-ups around the world.
Selipsky is back, and we’re into the home straight for this morning’s keynote now.
We’re now turning to simulations, and how important they can be in fields from weather to construction.
A new field of “spatial simulations” is growing in popularity though, being used by cities to help model traffic, new housing projects, or disaster response.
Such models are incredibly complex, and require 3D models and engines – but until now, most of these simply weren’t powerful enough to work in an effective way.
AWS has now launched AWS SimSpace Weaver to run large-scale simulations without being constrained by your hardware or managing infrastructure – meaning devs can spend more time working on the simulation itself.
It also integrates with top 3D engines such as Unreal and Unity, and could be a real game-changer in building the world of tomorrow, pushing the boundaries of what’s possible, Selipsky notes.
Finally, after all this exploration on Earth – it’s time to explore the world of “imagination”.
Selipsky notes it is “a world we can all explore…a world of total possibility.”
Removing constraints and bringing together different experiences to build something new can be incredibly valuable, Selipsky notes, citing J.M. Barrie and J.R.R. Tolkien as examples.
But exploring imagination can also be a collaborative experience, as people come together to share ideas and build on initial suggestions via collaboration.
“Technology has changed how we explore our imagination together…the world is all one neighbourhood, and we can all come out to play,” Selipsky says.
When it comes to removing constraints, the cloud from AWS has allowed businesses to scale up and down quickly – but the company can do more.
In contact centers, Amazon Connect has helped power more than 10 million interactions per day, and is now getting a number of new expansions.
This includes ML-driven forecasting, capacity planning and scheduling, to make sure the right agents are available at the right time.
Contact Lens is also getting agent performance management, and there’s a new agent workspace with guided step-by-step actions.
Supply chains have been hit hard over the last two years, and AWS is now looking to help companies address inventory issues to let them react to such problems.
The company is announcing AWS Supply Chain, a whole new cloud application aiming to improve supply chain visibility, provide actionable insights, and lower costs.
It can provide real-time maps of your business’ supply chain, letting you drill down into precise locations to spot any issues. Machine learning is used to predict problems and offer solutions, so you can react fast, save costs and mitigate risks.
AWS is also looking to help companies bring together data from different environments.
The new AWS Clean Rooms offering looks to help companies and their partners securely analyze and collaborate on data sets without sharing the underlying raw data, helping firms better understand their own customers and enabling joint data analysis.
For example, in advertising, companies can now create relevant campaigns, while protecting customer data.
Selipsky is now looking at AWS’ work in healthcare, with a range of specialized tools helping firms and bodies across the world improve patient care and treatment.
It is launching Amazon Omics, a purpose-built service to store, query and analyze genomic and other “omics” data to help build the next generation of healthcare solutions.
Next it’s on to retail, where Amazon’s own “Just Walk Out” technology has proved popular in its Go and Fresh stores, as well as inspiring Amazon One, its palm-recognition technology.
Wrapping up, Selipsky notes how challenging environments can be the key to building business success.
“This is the real power of the cloud,” he notes, “we are so passionate because we see what you are doing with the cloud – businesses look at the cloud as it provides innovation we’ve never seen before.”
And with that, we’re done for the morning!
A whole host of intriguing announcements and launches to digest, with AWS seemingly poised to launch into a whole host of new areas.
We’re going to grab a coffee and get investigating all of the news in more detail – so until tomorrow’s keynote with Swami Sivasubramanian, stay tuned to TechRadar Pro for all the latest AWS re:Invent 2022 news!
Welcome to day two of AWS re:Invent 2022!
We’re (kind of) well-rested and (definitely) re-caffeinated, and ready for this morning’s keynote from Swami Sivasubramanian, Vice President of AWS Data and Machine Learning.
We’re expecting a deeper dive into some of the services and tools announced yesterday, and possibly some new product announcements too.
Bit of a different musical vibe this morning ahead of the keynote, as a woman with the sparkliest jacket ever is spinning some more dance-y tunes to warm up the crowd.
It’s time for this morning’s keynote – and after another space-themed intro video about growing and nurturing business ideas, Dr Swami Sivasubramanian, Vice President of Data and Machine Learning at AWS, takes to the stage.
He notes how many great breakthrough scientific discoveries and lightbulb moments have appeared to be happy coincidences, but actually with a huge amount of past information influencing them.
Great breakthroughs can happen when our insights are paired with past information – and the same can happen with organizations too, Sivasubramanian notes.
However unlike the human brain, business data isn’t centralized, he notes. Building automation is key to helping us get the most out of our data, which doesn’t naturally flow like it does in the brain.
“I strongly believe data is the genesis for modern invention,” he says. “It is absolutely critical that today’s organizations have the right systems.”
Sivasubramanian harks back to the early days of Amazon, when it built early forms of recommendation engines to help book shoppers find new products.
15 years on, AWS is still holding a torch for innovation in the data space, he notes.
More than 1.5m companies across the world now come to AWS for their data needs, whether that’s database, analytics, or machine learning services.
There are three core elements of a data strategy, Sivasubramanian notes – the first of which is building a future-proof data foundation.
To him, this means building the right services, so that you don’t need to keep re-engineering as your needs evolve.
This includes access to the right tools for every workload, performance at scale, removing heavy lifting, and finally being reliable and scalable.
More than 94% of AWS customers use 10+ of its database and analytics services, Sivasubramanian notes – there’s no one-size-fits-all offering here.
Machine learning capabilities are a major selling point too – with a huge range of services that make it easier to build and deploy ML models end-to-end.
“All of these services…enable you to store and query your data,” he says.
Now a look at Amazon Athena. AWS made it easy to use, Sivasubramanian notes, making it incredibly popular, with tens of thousands of customers signed up already.
Sivasubramanian reveals Amazon Athena for Apache Spark, giving a much more intuitive way to run complex data analytics, starting up in under a second, meaning you spend more time on insights, rather than waiting on results.
Apache Spark runs 3x faster on AWS, Sivasubramanian claims.
When it comes to performance at scale, AWS offers many of the tools your business needs to provide scalability and processing power, Sivasubramanian says.
Next, it’s the launch of Amazon DocumentDB Elastic Clusters, a fully-managed solution for document workloads of virtually any size or scale. It can elastically scale workloads in minutes, and automatically manages the underlying infrastructure, saving developers months of time.
Our first customer showcase comes from Rathi Murthy, CTO of Expedia Group, who discusses how AWS helped the firm as it scaled quickly, now processing 600 billion AI predictions per year, powered by 70 petabytes of data.
The company uses tools including Amazon EKS, DynamoDB and SageMaker to get the most out of all its data.
Sivasubramanian returns, and now it’s time to talk about removing heavy lifting – namely through DevOps using AI and machine learning tools such as SageMaker.
SageMaker, which offers a full end-to-end ML journey, now handles trillions of predictions every month as customers utilize big data to build smarter ML models.
Preparing unstructured data (which makes up 80% of all data) for ML is labor-intensive, though, and many companies want a simpler way to get to grips with it.
Sivasubramanian highlights the particular challenge of geospatial data, which has grown hugely in popularity in recent years across a number of industries.
To help with these challenges, the company announces that Amazon SageMaker now supports geospatial ML, giving access to all kinds of different data with just a few clicks.
A demo of the new capabilities shows how it could be used in natural disasters, predicting dangerous road conditions due to rising flood water levels, making it easier for emergency services to access affected areas.
Such models can cut down response times from days to minutes, potentially saving lives.
Sivasubramanian now turns to reliability and security – a crucial consideration for businesses everywhere.
“You need to put the right safeguards in place,” he notes, highlighting how AWS has a long history of building such secure and reliable services.
However customers are always asking for more, particularly when it comes to apps and databases – so a new Amazon Redshift Multi-AZ feature will help deliver high availability and reliability for workloads.
To help manage the security of PostgreSQL services, there’s also new Trusted Language Extensions for PostgreSQL, a new open-source project to support extensions on Amazon RDS and Aurora.
Now on to more Amazon security products – namely the likes of GuardDuty, to protect against evolving security threats.
It’s getting an Aurora-themed extension via the new Amazon GuardDuty RDS Protection, using ML to accurately detect suspicious activity.
Sivasubramanian is now moving on to discuss how weaving “connective tissue” can help support and secure your organization.
He uses the metaphor of the living root bridges of India’s rainforests, created by locals to connect their isolated villages to vital resources and urban centers.
Similarly, your business can use quality tools and data to drive future growth, using a system of co-operation to connect siloed teams, creating strong pathways to vital resources.
Looking now to data lakes, Sivasubramanian highlights how maintaining data quality typically requires time-consuming manual rule creation.
To help with this, there’s the new AWS Glue Data Quality, which helps users build confidence in their data so they can make better-informed critical decisions every day, reducing manual efforts from days to hours.
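As a rough sketch of how this might be used, here’s a boto3 call creating a data quality ruleset against a Glue catalog table. The DQDL rules, database and table names are illustrative, and the exact parameters may differ from the shipped API.

```python
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# Create a reusable data quality ruleset for a (hypothetical) catalog table.
# Rules are expressed in Glue's data quality definition language (DQDL).
glue.create_data_quality_ruleset(
    Name="orders_basic_checks",
    Description="Baseline quality checks for the orders table",
    Ruleset='Rules = [ IsComplete "order_id", ColumnValues "price" > 0, RowCount > 1000 ]',
    TargetTable={
        "DatabaseName": "sales_db",
        "TableName": "orders",
    },
)
```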
As for governance, rather than being a hindrance, using the right system could actually help your business move faster.
Sivasubramanian notes how customers were asking for smarter, more intuitive governance systems that aren’t so time-intensive – and so the company has introduced Centralized Access Controls for Redshift Data Sharing, giving easily-managed access controls for shared Redshift data.
There’s also a host of challenges when it comes to using machine learning for governance, Sivasubramanian notes.
To address this, Amazon is launching three new capabilities for SageMaker – ML Governance Role Manager, Model Cards and Model Dashboard.
All these services should make using ML for governance much simpler and more straightforward, Sivasubramanian says.
Now on to the newly-announced Amazon DataZone, which helps users catalog, discover, share and govern data across an organization.
Sivasubramanian says he’s been an early user of the service, and introduces a demo of DataZone which shows off how easy it is to pull together the data your business needs to stay useful and efficient.
The demo shows how sales and marketing teams could use DataZone to create more effective advertising campaigns, bringing together different teams to get the most out of their data.
Sivasubramanian returns, and it’s time to talk about getting the right pathways to vital resources.
Connected data stores are critical for survival, he notes, but connecting data often requires complex ETL pipelines.
As mentioned by CEO Adam Selipsky yesterday, AWS is moving towards a zero-ETL future, Sivasubramanian notes, bringing services closer together to remove friction and make everything just work more smoothly.
This includes Aurora getting a zero-ETL integration with Redshift, removing heavy lifting and obstructions across the board.
Customers are looking for easier ways to move and analyze big data, Sivasubramanian notes, and AWS wants to do just that.
There’s a new auto-copy feature for Amazon Redshift from S3, making it easier to create and maintain simple data ingestion pipelines.
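Here’s a rough sketch, via the Redshift Data API, of what setting up such an ingestion job could look like. The COPY JOB syntax is based on the preview announcement and may differ in the final release, and the cluster, table, bucket and role names are placeholders.

```python
import boto3

redshift_data = boto3.client("redshift-data", region_name="us-east-1")

# Define a copy job that keeps ingesting new files landing under the S3 prefix.
# The JOB CREATE ... AUTO ON clause reflects the auto-copy preview and is illustrative.
copy_sql = """
    COPY public.orders
    FROM 's3://my-ingest-bucket/orders/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
    FORMAT AS CSV
    JOB CREATE orders_auto_copy
    AUTO ON;
"""

redshift_data.execute_statement(
    ClusterIdentifier="my-redshift-cluster",
    Database="dev",
    DbUser="admin",
    Sql=copy_sql,
)
```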
AWS is attacking the problem of data sprawl, Sivasubramanian notes, and wants customers to be able to seamlessly connect all their data.
Users can stream data in real-time from over 20 AWS and third-party sources using Kinesis, and use SageMaker and Appflow to bring in data from a wide range of sources.
Amazon AppFlow now offers 50+ connectors, Sivasubramanian notes, with the likes of Facebook Ads, Google Analytics and Salesforce now newly available.
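To give a flavour of the streaming side mentioned above, here’s a minimal boto3 sketch that pushes a record into a Kinesis data stream – the stream name and payload are made-up examples.

```python
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Push a single clickstream event into a (hypothetical) Kinesis data stream.
event = {"user_id": "u-123", "action": "page_view", "page": "/pricing"}

kinesis.put_record(
    StreamName="clickstream-events",
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["user_id"],
)
```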
The company now offers connections to hundreds of data sources, he notes, and to show off how this can be a game-changer, it’s time for the next customer showcase.
AstraZeneca is using a whole host of AI and ML services to research, create and build new treatments for a wide range of diseases, and is using AWS to manage its huge data estate, moving 25 petabytes of data across its global network.
Sivasubramanian is back, and moves on to talk about how AWS is working on democratizing data with tools and education.
The company is helping students around the world with little access to technology, with Sivasubramanian remembering his upbringing in rural Southern India where he only had access to a computer for 10 minutes a week at school.
The AI workforce is expected to add a million jobs by 2029, but finding the right skills and candidates to fill these vacancies will be a significant challenge.
AWS is helping community colleges and minority-serving institutions (MSIs) expand their education efforts, with a new AWS Machine Learning University educator training program giving hands-on sessions to help prepare the next generation of workers.
AWS is also building out its AI and ML scholarship program, awarding $10 million to 2,000 selected students.
Data literacy is also a crucial consideration when it comes to professional training, with AWS providing ML tools and certifications to boost knowledge for users everywhere, including more than 150 new courses launched in the last year alone.
Low-code and no-code tools are also a vital addition for many businesses, with the likes of Amazon QuickSight offering an introduction to such systems, built for the cloud.
CEO Adam Selipsky unveiled new ML-powered forecasting for QuickSight Q yesterday – the tool lets users ask natural language questions of the platform to get clearer insights – which Sivasubramanian says can be a major breakthrough for users everywhere.
SageMaker Canvas also offers a low-code option for users, and Sivasubramanian introduces Warner Bros Games to explain how it used the platform to manage some of its most important launches.
Wrapping up, Sivasubramanian notes how all these factors play a critical role in helping your business do more with your data.
“It’s individuals who create these parts…but it’s leaders who empower them with the right tools,” he notes.
And that’s a wrap from Sivasubramanian – a lot to digest and consider, so we’re off to find out more on just how these platforms and services can be used.
Thanks for sticking with us this morning – there’s much more to come later today and tomorrow, so stay tuned to TechRadar Pro for all the AWS re:Invent 2022 news!
In keeping with the slightly bizarre musical intros to our keynotes this year, this morning’s entertainment features a string…trio playing classical versions of top 40 hits.
Hearing Billie Eilish’s “Bad Guy” on strings is definitely a haunting way to start the day.