# AWS Glossary
This page provides a quick reference to cloud computing concepts and AWS terminology, with links to the official AWS documentation.
S3 Buckets: Amazon Simple Storage Service (S3) is designed for data storage in the cloud. Users create an S3 bucket within a specific AWS Region via the AWS Command Line Interface (CLI) or the AWS Management Console web interface. The bucket is then used much like a standard directory to upload, store, and download data.
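A minimal sketch of that workflow with the AWS CLI; the bucket name, region, and file paths below are hypothetical placeholders to substitute with your own:

```shell
# Create a bucket in a specific region (bucket names are globally unique).
aws s3 mb s3://my-example-bucket --region us-east-1

# Upload a local file into the bucket, list its contents, then download the file back.
aws s3 cp sample.txt s3://my-example-bucket/inputs/sample.txt
aws s3 ls s3://my-example-bucket/inputs/
aws s3 cp s3://my-example-bucket/inputs/sample.txt ./sample-copy.txt
```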
EC2 Instances: Amazon Elastic Compute Cloud (Amazon EC2) provides the computational resources for AWS users. Instances come in a variety of sizes, operating systems, core counts, and prices.
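Launching an instance from the CLI looks roughly like this; the AMI ID, key pair name, and instance type are hypothetical examples, not values from this page:

```shell
# Launch one t3.medium instance from a given Amazon Machine Image (AMI).
aws ec2 run-instances \
    --image-id ami-0abcdef1234567890 \
    --instance-type t3.medium \
    --key-name my-key-pair \
    --count 1

# List running instances to confirm the launch.
aws ec2 describe-instances \
    --filters "Name=instance-state-name,Values=running"
```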
EBS Volumes: Amazon Elastic Block Store (Amazon EBS) provides the block-level storage volumes that EC2 instances attach and use for computation.
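Creating and attaching a volume might look like the following sketch; the volume ID, instance ID, and Availability Zone are hypothetical placeholders (the volume must be in the same AZ as the instance):

```shell
# Create a 100 GiB gp3 volume in the instance's Availability Zone.
aws ec2 create-volume \
    --availability-zone us-east-1a \
    --size 100 \
    --volume-type gp3

# Attach the new volume to a running instance as a secondary device.
aws ec2 attach-volume \
    --volume-id vol-0123456789abcdef0 \
    --instance-id i-0123456789abcdef0 \
    --device /dev/sdf
```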
AWS Batch: AWS Batch dynamically provisions the optimal quantity and type of compute resources (Spot or On-Demand) necessary for an analysis. It functions analogously to a relatively simple job scheduler used in HPC environments (e.g. LSF or SGE) to schedule and execute batch computing workloads. You can read this blog post for more information on using AWS Batch to run genomics workflows.
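Submitting a job from the CLI is sketched below; the job queue and job definition names are hypothetical and assume those resources have already been created in your account:

```shell
# Submit a job to an existing job queue using a registered job definition.
aws batch submit-job \
    --job-name my-analysis-job \
    --job-queue my-genomics-queue \
    --job-definition my-jobdef:1

# Check the job's status using the job ID returned by submit-job.
aws batch describe-jobs --jobs <job-id-from-submit-output>
```

AWS Batch then picks the instance types and counts to run the queued jobs, much as an HPC scheduler dispatches jobs to cluster nodes.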
EBS Autoscaling: With EBS autoscaling, EC2 instances run a process that monitors disk usage and adds EBS volumes on the fly to expand the free space once a capacity threshold is reached. This feature allows increases in volume size, adjustments to performance, and changes to the volume type while the volume is in use, e.g. the disk size grows automatically while processes are running.
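The underlying in-use resize can also be done manually; a rough sketch with a hypothetical volume ID and device names (the partition/filesystem commands assume an ext4 filesystem on an NVMe device, which varies by instance type and OS):

```shell
# Grow an attached, in-use volume to 200 GiB (an Elastic Volumes modification).
aws ec2 modify-volume --volume-id vol-0123456789abcdef0 --size 200

# On the instance, extend the partition and the filesystem to use the new space.
sudo growpart /dev/nvme0n1 1
sudo resize2fs /dev/nvme0n1p1
```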