We dug into our own data to find out how some of the biggest companies are using Amazon Web Services, and we reached out to architects and engineers at these companies to corroborate the stories. Here are some real-world examples of how the biggest companies are using AWS.
Financial Technology · Mountain View, CA · TurboTax, QuickBooks, Credit Karma, Mailchimp
The most interesting thing Intuit does with AWS is catch fraud in real time. Their risk system watches over a trillion dollars of payments a year and decides in under a second whether each transaction is legitimate.
Signals from across Intuit's products flow in, and machine learning models built on Amazon SageMaker and Bedrock score each one. The answer comes back fast enough that a customer checking out on QuickBooks or filing through TurboTax never feels a pause.
What makes the problem hard is the tradeoff. Block too many transactions and good customers get frustrated and leave. Block too few and fraud losses pile up. The team tunes the system constantly to push fraud losses down while keeping approval rates high for real customers.
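That tradeoff can be sketched as a threshold-tuning problem: pick the loosest blocking threshold whose expected fraud losses stay under a cap, so approvals for good customers stay as high as possible. This is a toy illustration, not Intuit's actual system; the scores, amounts, and loss cap are invented.

```python
# Illustrative sketch: tuning a fraud-score threshold to balance
# approval rate against fraud losses. All numbers are made up.

def pick_threshold(transactions, max_fraud_loss):
    """Choose the highest blocking threshold whose fraud losses
    stay under max_fraud_loss, maximizing approvals for good users.

    transactions: list of (score, amount, is_fraud) tuples, where
    score is the model's fraud probability for that transaction.
    """
    candidates = sorted({score for score, _, _ in transactions})
    best = 0.0  # block everything by default
    for threshold in candidates:
        # A transaction is approved when its score is below the threshold,
        # so fraud losses grow monotonically as the threshold rises.
        fraud_loss = sum(amount for score, amount, is_fraud in transactions
                         if is_fraud and score < threshold)
        if fraud_loss <= max_fraud_loss:
            best = threshold  # higher threshold => more approvals
    return best

txns = [
    (0.05, 120, False), (0.10, 80, False), (0.92, 500, True),
    (0.40, 60, False), (0.75, 300, True), (0.20, 45, False),
]
threshold = pick_threshold(txns, max_fraud_loss=300)
approved = [t for t in txns if t[0] < threshold]
```

The real system scores each transaction in under a second and retunes constantly as fraud patterns shift, but the core tension is the same: every notch the threshold moves trades approvals against losses.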
The same setup handles the boring-but-critical parts too: identity checks, compliance rules, and dispute handling. All of it moves through the same pipelines, so a single transaction can be scored, verified, and logged for auditors in one pass.
Design Software · San Francisco, CA · AutoCAD, Fusion, Revit, Maya
Autodesk uses AWS to power universal search across its design software. When an architect using Revit, an engineer using Fusion, or a filmmaker using Maya wants to find a project file, a part, or a model, the search has to dig through every kind of data Autodesk stores and bring back the right answer fast.
The search platform is built on Java services running on AWS, with OpenSearch and Elasticsearch handling the heavy lifting of indexing and querying. It sits underneath all three of Autodesk's industry clouds (Fusion for manufacturing, Forma for architecture, and Flow for media), so one search bar can reach across designs that look completely different under the hood.
Autodesk is now layering vector databases and machine learning into the same setup. Instead of just matching keywords, the search can understand what a designer actually means when they ask for "a bracket like the one from last quarter's project," even if those exact words appear nowhere in the file.
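The core idea behind that semantic matching is that documents and queries are embedded as vectors, and similarity between vectors (not keyword overlap) decides the match. A minimal sketch, with hand-made three-dimensional "embeddings" standing in for what a real embedding model would produce:

```python
import math

# Toy vector search: the query shares no keywords with either file name,
# but its embedding sits close to the bracket-like part's embedding.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

docs = {
    "mounting_plate_q3.dwg": [0.9, 0.8, 0.1],  # bracket-like part
    "render_settings.max":   [0.1, 0.2, 0.9],  # unrelated media file
}
query = [0.85, 0.75, 0.15]  # "a bracket like the one from last quarter"

best = max(docs, key=lambda name: cosine(query, docs[name]))
```

A production vector database does this over millions of high-dimensional embeddings with approximate nearest-neighbor indexes, but the ranking principle is the same.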
The data behind it lives in DynamoDB, MySQL, and S3, with services like Lambda, SQS, and Step Functions wiring everything together. The team treats reliability as the headline feature, since a search outage means designers, engineers, and manufacturers across millions of Autodesk users can't find their own work.
Healthcare & Diagnostics · Burlington, NC · Lab Testing, Drug Development
Labcorp uses AWS to run the data lake behind 600 million medical tests a year. Every blood draw, biopsy, and screening generates results that have to be stored, analyzed, and made searchable for doctors, hospitals, and drug companies, and AWS is where that data lives.
The lake is built on Amazon S3 with Redshift, EMR, Glue, and Athena handling the work of moving and querying it. Structured results from lab instruments flow in alongside unstructured data like clinical notes, then get processed through Glue and Spark pipelines before landing somewhere a researcher can actually use them.
Streaming is part of the same setup. Labcorp uses Amazon Kinesis to push test results in close to real time, so a hospital ordering a panel doesn't have to wait for an overnight batch job to see what came back from the lab.
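Streams like Kinesis keep results for one order in sequence by routing records with the same partition key to the same shard. The sketch below mimics that MD5-based routing in simplified form; the record fields and shard count are invented for illustration.

```python
import hashlib
import json

# Kinesis-style routing: records carrying the same partition key (here,
# the lab order's accession number) always land on the same shard, so
# results for one order arrive in the sequence they were produced.

NUM_SHARDS = 4

def shard_for(partition_key):
    """Simplified stand-in for Kinesis's MD5 hashing of partition keys."""
    digest = int(hashlib.md5(partition_key.encode()).hexdigest(), 16)
    return digest % NUM_SHARDS

def make_record(accession, test_code, value):
    return {
        "partition_key": accession,
        "data": json.dumps({"accession": accession,
                            "test": test_code,
                            "value": value}),
        "shard": shard_for(accession),
    }

# Two results for the same order share a shard; ordering is preserved.
r1 = make_record("ACC-1001", "CBC", "normal")
r2 = make_record("ACC-1001", "LIPID", "elevated LDL")
```

Different accession numbers spread across shards, which is what lets the stream scale horizontally while each order's results stay ordered.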
The team also feeds this data into Databricks for the heavier analytics work, like spotting patterns across millions of patient results that help pharmaceutical companies decide which drugs to advance. The same lake that stores a single patient's cholesterol number is also the one helping researchers figure out what works at scale.
Apparel & Footwear · Greensboro, NC · The North Face, Vans, Timberland, Dickies
VF Corporation uses AWS to forecast demand across its brands, including The North Face, Vans, Timberland, and Dickies. The question every retailer wrestles with is how many of which thing to make and where to put it, and VF runs the math behind that decision on AWS.
The data feeding these forecasts is enormous, every sale across wholesale, e-commerce, and retail stores for brands sold in dozens of countries. AWS handles the storage and processing so a forecast can pull from years of history and return predictions specific to a region, a season, or a single product line.
VF deploys its models on AWS using Lambda for serverless inference and SQS for streaming the prediction requests as they come in from regional trading teams. DynamoDB holds the forecast outputs so any downstream system can pull the latest numbers without hitting the model again.
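The caching layer described above follows a read-through pattern: serve the stored forecast when one exists, and only invoke the model on a miss. A minimal sketch, with a dict standing in for DynamoDB and a fake model in place of VF's real one:

```python
# Read-through forecast cache: downstream systems get the latest numbers
# without re-running the model. All names and values are illustrative.

forecast_cache = {}
model_calls = 0

def run_model(region, product_line):
    """Stand-in for the real demand-forecasting model."""
    global model_calls
    model_calls += 1
    return {"region": region, "product": product_line, "units": 12_500}

def handler(event):
    """Lambda-style entry point: serve from cache, fall back to the model."""
    key = (event["region"], event["product_line"])
    if key not in forecast_cache:
        forecast_cache[key] = run_model(*key)
    return forecast_cache[key]

first = handler({"region": "EMEA", "product_line": "dickies-jackets"})
second = handler({"region": "EMEA", "product_line": "dickies-jackets"})
```

The second request returns the stored result without touching the model, which is exactly why the forecast outputs live in DynamoDB rather than being recomputed per caller.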
The same AWS setup now powers pricing optimization, using forecasts to figure out what a Dickies jacket should cost in October versus December. Each model VF ships on AWS pushes the company further from gut-feel buying decisions toward ones backed by what the data actually says.
Athletic Apparel · Vancouver, BC · Yoga, Running, Training Apparel
Lululemon uses AWS to run the supply chain that gets a pair of leggings from raw fabric to a store shelf or someone's doorstep. The company tracks materials from suppliers, monitors quality at factories, and feeds order data into the systems that decide what gets made and shipped where, all on AWS.
The supply chain platform is built on Java services connected through Kafka for live streaming data, with AWS Lambda handling event-driven processing as orders move between systems. When a fabric shipment arrives at a manufacturer or a finished garment leaves a warehouse, that update flows through the AWS pipeline so every downstream system sees it within seconds.
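The publish-subscribe flow underneath that can be sketched in a few lines: one supply-chain update is published once, and every subscribed downstream system sees it. The event shape and the two consumer views are invented for illustration.

```python
# Minimal pub/sub sketch of the event-driven supply chain flow: a single
# status update fans out to every downstream consumer that subscribed.

subscribers = []

def subscribe(handler):
    subscribers.append(handler)

def publish(event):
    """Deliver one event to every registered downstream handler."""
    for handler in subscribers:
        handler(event)

warehouse_view = {}
store_view = {}

# Two independent downstream systems track the same item's status.
subscribe(lambda e: warehouse_view.update({e["sku"]: e["status"]}))
subscribe(lambda e: store_view.update({e["sku"]: e["status"]}))

publish({"sku": "LEGGING-25", "status": "left_warehouse"})
```

In production the bus is Kafka and the handlers are Lambda functions, but the property that matters is the same: the producer emits once and never needs to know who is listening.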
Lululemon runs all of this through a multi-account AWS Control Tower setup with Step Functions orchestrating the longer-running workflows that span sourcing, manufacturing, and logistics. The team uses CloudFormation and Terraform to spin up identical environments across regions, since Lululemon ships to dozens of countries and each one has its own compliance and routing requirements.
The same AWS backbone now connects directly into stores and e-commerce, so a sale in Vancouver triggers a replenishment signal that travels through the supply chain in close to real time. AWS is what lets one item moving through one part of the business reach every other part of it without hitting a wall.
Financial Services · McLean, VA · Credit Cards, Banking, Auto Loans
Capital One uses AWS to run the machine learning platform that powers its fraud detection, credit decisions, and customer-facing AI features. The bank serves over 100 million customers, and the models that decide whether to approve a transaction or flag it as suspicious all run through this internal ML platform built on AWS.
The platform handles every stage of the model lifecycle on AWS. Engineers build training pipelines on Amazon SageMaker, package models into containers, and deploy them through Kubernetes clusters running on AWS for scoring at the speeds card transactions require. KServe handles model serving so a single model update can roll out across the bank without downtime.
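The zero-downtime rollout KServe enables is essentially weighted traffic splitting: a new model version takes a small fraction of requests while the stable version keeps serving the rest. A toy sketch of that routing decision; the percentages and version names are illustrative, not Capital One's configuration:

```python
import random

# Canary-style traffic splitting: roughly canary_percent of requests go
# to the new model version, the rest to the stable one.

def route(canary_percent, rng):
    """Return which model version handles this request."""
    return "v2-canary" if rng.random() * 100 < canary_percent else "v1-stable"

rng = random.Random(42)  # seeded for reproducibility
decisions = [route(canary_percent=10, rng=rng) for _ in range(1000)]
canary_share = decisions.count("v2-canary") / len(decisions)
```

If the canary's error rates and latencies hold up, the weight shifts toward 100 and the old version drains, which is how a single model update rolls out across the bank without downtime.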
The same AWS foundation now runs Capital One's generative AI work, including the agent platforms its associates use internally and the AI features showing up in customer products. Bedrock and SageMaker handle the model hosting, while Lambda and API Gateway connect those models into the bank's existing applications.
Capital One was an early all-in AWS bank and still treats that decision as a competitive edge. Every new product the company ships, from credit card decisioning to fraud scoring to AI assistants, gets built on the same AWS platform rather than bolted onto legacy systems.
Financial Services · New York, NY · Market Intelligence, Indices, Commodity Insights
S&P Global uses AWS to run the data and analytics platforms behind its market intelligence products, including the systems that calculate the S&P 500 and Dow Jones Industrial Average. The indices millions of investors track every day are computed and distributed through infrastructure built on AWS.
The index calculation platforms run on Amazon EC2, EMR, and Lambda, with S3 holding the market data feeds and historical pricing that go into each calculation. When a stock price moves during trading hours, that change flows through AWS pipelines and gets reflected in the index value within seconds.
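The arithmetic at the heart of a cap-weighted index is simple: the index level is the total market cap of the constituents divided by a divisor. A back-of-the-envelope sketch; the two-stock constituent list and divisor are made up, and the real S&P 500 also applies float adjustments and corporate-action rules to the divisor:

```python
# Toy cap-weighted index: level = sum(price * shares) / divisor.
# When a constituent's price moves, the level moves with it.

def index_level(constituents, divisor):
    total_cap = sum(price * shares for price, shares in constituents.values())
    return total_cap / divisor

constituents = {
    "AAA": (100.0, 1_000),  # (price, shares outstanding)
    "BBB": (50.0, 2_000),
}
divisor = 40.0

before = index_level(constituents, divisor)
constituents["AAA"] = (102.0, 1_000)  # AAA's price ticks up $2
after = index_level(constituents, divisor)
```

Scaling that loop to hundreds of constituents repriced many times a second is what the EC2, EMR, and Lambda pipelines do, recomputing and distributing the level within seconds of each tick.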
S&P also runs its GIS and geospatial analytics for the energy and commodities business on AWS, processing satellite imagery and pipeline data through ETL workflows orchestrated by AWS Glue and Step Functions. Snowflake sits on top for analytical queries that customers in oil, gas, and metals use to track supply chains.
The newest layer is generative AI, with S&P building research assistants on Amazon Bedrock that let customers query decades of financial data in plain language. Each new analytics product the company ships, from credit ratings tools to commodities dashboards, gets built on the same AWS foundation rather than spun up as a separate stack.
Financial Services · Charlotte, NC · Banking, Credit Cards, Wealth Management
Truist uses AWS to run the customer-facing mobile and digital banking experience that millions of Americans use to check balances, pay bills, and move money. The bank's mobile apps and the APIs that power them sit on AWS, where Java microservices on ECS Fargate handle requests from iOS and Android clients in real time.
The mobile teams build REST APIs in Java and Spring Boot, package them as containers, and deploy them through CI/CD pipelines into AWS. SQS queues handle the messaging between services so a tap in the app to transfer money can flow through fraud checks, account validation, and posting without any single piece becoming a bottleneck.
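That queued flow can be sketched as a pipeline of independent stages: a transfer request passes through fraud screening and account validation before posting, with each stage decoupled so none becomes a bottleneck. The stage logic and field names below are invented for illustration, not Truist's code:

```python
# Sketch of a staged transfer pipeline: each stage does one check and
# passes the enriched request along, mirroring services decoupled by
# SQS queues.

def fraud_check(transfer):
    # Toy rule: flag transfers at or above $10,000 for review.
    transfer["fraud_cleared"] = transfer["amount"] < 10_000
    return transfer

def validate_accounts(transfer):
    transfer["accounts_valid"] = bool(transfer["from_acct"] and transfer["to_acct"])
    return transfer

def post(transfer):
    transfer["posted"] = transfer["fraud_cleared"] and transfer["accounts_valid"]
    return transfer

PIPELINE = [fraud_check, validate_accounts, post]

def process(transfer):
    for stage in PIPELINE:
        transfer = stage(transfer)
    return transfer

result = process({"from_acct": "123", "to_acct": "456", "amount": 250})
```

In production each stage is its own service consuming from a queue, so a slow fraud check backs up its queue rather than stalling the app's response.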
The same AWS foundation runs Truist's identity and access management for cloud workloads, with IAM policies, KMS encryption, and federated single sign-on tying together the bank's Azure, M365, and AWS environments. AWS WAF and Shield Advanced sit in front of customer-facing apps to block bot traffic and DDoS attempts before they hit the banking systems.
Truist also runs Confluent Kafka clusters on AWS for the real-time event streams behind trading, payments, and fraud detection, with Terraform managing the infrastructure as code so a new environment can be stood up identically across regions.
Government Administration · Edinburgh, Scotland · Public Services, Social Security, Disclosure Scotland
The Scottish Government uses AWS to run a shared cloud platform that delivers public services to the people of Scotland, from student finance to social security benefits to criminal record checks. Rather than each agency building its own infrastructure, the Digital Directorate runs one common platform on AWS that all the public sector bodies tap into.
The platform team builds reusable AWS components, with Terraform and CloudFormation defining the infrastructure as code. New services like the upcoming Scottish Government mobile app get deployed onto the same foundation, running on Kubernetes via Amazon EKS and serverless functions via AWS Lambda, with S3, DynamoDB, and Redis for storage.
Social Security Scotland runs the systems that pay out £6.7 billion in benefits each year through this AWS setup, using AWS Cost Explorer and CloudWatch to keep spending in check while serving roughly 2 million people. Disclosure Scotland was the first UK government body to host police data in AWS, relying on IAM, KMS, and CloudTrail to meet the security bar that comes with handling criminal record information.
Software Development · Walldorf, Germany · ERP, S/4HANA, BTP, SuccessFactors
SAP runs its Enterprise Cloud Services on AWS, the managed private cloud service that hosts SAP applications like S/4HANA, HANA, BTP, and SuccessFactors for thousands of enterprise customers worldwide. When a Fortune 500 company runs its ERP "in SAP cloud," a big chunk of that workload sits on AWS underneath.
The SAP teams operating these environments treat AWS as the foundation for the whole stack. Linux servers run on EC2, networking gets stitched together with VPCs, Transit Gateways, NAT Gateways, and route tables, and IAM policies plus security groups gate who and what can talk to the SAP systems. Cost Explorer and CloudWatch keep tabs on spend and performance across thousands of customer environments.
For the newer Sovereign Cloud offering, where SAP guarantees customer data stays inside specific countries like the UK, Canada, and Australia, SAP architects multi-account AWS structures with Terraform and Ansible defining the infrastructure as code. CI/CD pipelines push changes through Concourse, ArgoCD, and GitLab, and an internal Operations Control Plane orchestrates provisioning and lifecycle management across regions.