aws iotanalytics

AWS IoT Analytics allows you to collect large amounts of device data, process messages, and store them. You can then query the data and run sophisticated analytics on it. AWS IoT Analytics enables advanced data exploration through integration with Jupyter Notebooks and data visualization through integration with Amazon QuickSight.

Traditional analytics and business intelligence tools are designed to process structured data. IoT data often comes from devices that record noisy processes (such as temperature, motion, or sound). As a result, the data from these devices can have significant gaps, corrupted messages, and false readings that must be cleaned up before analysis can occur. IoT data is also often meaningful only in the context of other data from external sources.

AWS IoT Analytics automates the steps required to analyze data from IoT devices. It filters, transforms, and enriches IoT data before storing it in a time-series data store for analysis. You can set up the service to collect only the data you need from your devices, apply mathematical transforms to process the data, and enrich the data with device-specific metadata such as device type and location before storing it. Then, you can analyze your data by running queries using the built-in SQL query engine, or perform more complex analytics and machine learning inference. AWS IoT Analytics includes pre-built models for common IoT use cases, so you can answer questions such as which devices are about to fail or which customers are at risk of abandoning their wearable devices.
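
In terms of the subcommands listed below, the minimal resource chain is a channel feeding a two-activity pipeline (a channel activity followed by a datastore activity) that lands messages in a data store, with a SQL dataset defined on top. A sketch of that setup; all resource and activity names (mychannel, mydatastore, mypipeline, mydataset, ingest, store) are placeholders:

    # Create the channel that receives and archives raw messages
    aws iotanalytics create-channel --channel-name mychannel

    # Create the data store that holds the processed messages
    aws iotanalytics create-datastore --datastore-name mydatastore

    # Connect them with a pipeline: a channel activity feeding a datastore activity
    aws iotanalytics create-pipeline --pipeline-name mypipeline \
        --pipeline-activities '[
            {"channel": {"name": "ingest", "channelName": "mychannel", "next": "store"}},
            {"datastore": {"name": "store", "datastoreName": "mydatastore"}}
        ]'

    # Define a dataset whose content is produced by a SQL query over the data store
    aws iotanalytics create-dataset --dataset-name mydataset \
        --actions '[{"actionName": "sqlaction", "queryAction": {"sqlQuery": "SELECT * FROM mydatastore"}}]'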

Subcommands

batch-put-message
    Sends messages to a channel (see the example following this table)
cancel-pipeline-reprocessing
    Cancels the reprocessing of data through the pipeline
create-channel
    Creates a channel. A channel collects data from an MQTT topic and archives the raw, unprocessed messages before publishing the data to a pipeline
create-dataset
    Creates a dataset. A dataset stores data retrieved from a data store by applying a queryAction (a SQL query) or a containerAction (executing a containerized application). This operation creates the skeleton of a dataset. The dataset can be populated manually by calling CreateDatasetContent or automatically according to a trigger you specify
create-dataset-content
    Creates the content of a dataset by applying a queryAction (a SQL query) or a containerAction (executing a containerized application)
create-datastore
    Creates a data store, which is a repository for messages
create-pipeline
    Creates a pipeline. A pipeline consumes messages from a channel and allows you to process the messages before storing them in a data store. You must specify both a channel and a datastore activity and, optionally, as many as 23 additional activities in the pipelineActivities array
delete-channel
    Deletes the specified channel
delete-dataset
    Deletes the specified dataset. You do not have to delete the content of the dataset before you perform this operation
delete-dataset-content
    Deletes the content of the specified dataset
delete-datastore
    Deletes the specified data store
delete-pipeline
    Deletes the specified pipeline
describe-channel
    Retrieves information about a channel
describe-dataset
    Retrieves information about a dataset
describe-datastore
    Retrieves information about a data store
describe-logging-options
    Retrieves the current settings of the AWS IoT Analytics logging options
describe-pipeline
    Retrieves information about a pipeline
get-dataset-content
    Retrieves the contents of a dataset as presigned URIs
list-channels
    Retrieves a list of channels
list-dataset-contents
    Lists information about dataset contents that have been created
list-datasets
    Retrieves information about datasets
list-datastores
    Retrieves a list of data stores
list-pipelines
    Retrieves a list of pipelines
list-tags-for-resource
    Lists the tags (metadata) that you have assigned to the resource
put-logging-options
    Sets or updates the AWS IoT Analytics logging options. If you update the value of any loggingOptions field, it takes up to one minute for the change to take effect. Also, if you change the policy attached to the role you specified in the roleArn field (for example, to correct an invalid policy), it takes up to five minutes for that change to take effect
run-pipeline-activity
    Simulates the results of running a pipeline activity on a message payload
sample-channel-data
    Retrieves a sample of messages from the specified channel ingested during the specified timeframe. Up to 10 messages can be retrieved
start-pipeline-reprocessing
    Starts the reprocessing of raw message data through the pipeline
tag-resource
    Adds to or modifies the tags of the given resource. Tags are metadata that can be used to manage a resource
untag-resource
    Removes the given tags (metadata) from the resource
update-channel
    Updates the settings of a channel
update-dataset
    Updates the settings of a dataset
update-datastore
    Updates the settings of a data store
update-pipeline
    Updates the settings of a pipeline. You must specify both a channel and a datastore activity and, optionally, as many as 23 additional activities in the pipelineActivities array
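
As noted in the batch-put-message entry above, a typical round trip after the setup sketched earlier ingests a message, materializes the dataset, and fetches the result. The message ID and payload here are illustrative; with AWS CLI v2, blob fields such as payload are passed base64-encoded (eyJ0ZW1wIjogMjF9 decodes to {"temp": 21}):

    # Send one message into the channel
    aws iotanalytics batch-put-message --channel-name mychannel \
        --messages '[{"messageId": "msg-001", "payload": "eyJ0ZW1wIjogMjF9"}]'

    # Run the dataset's SQL action to produce new dataset content
    aws iotanalytics create-dataset-content --dataset-name mydataset

    # Retrieve presigned URIs for the latest dataset content
    aws iotanalytics get-dataset-content --dataset-name mydataset --version-id '$LATEST'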