aws iotanalytics
AWS IoT Analytics allows you to collect large amounts of device data, process messages, and store them. You can then query the data and run sophisticated analytics on it. AWS IoT Analytics enables advanced data exploration through integration with Jupyter Notebooks and data visualization through integration with Amazon QuickSight.

Traditional analytics and business intelligence tools are designed to process structured data, but IoT data often comes from devices that record noisy processes (such as temperature, motion, or sound). As a result, the data from these devices can have significant gaps, corrupted messages, and false readings that must be cleaned up before analysis can occur. IoT data is also often only meaningful in the context of data from external sources.

AWS IoT Analytics automates the steps required to analyze data from IoT devices. It filters, transforms, and enriches IoT data before storing it in a time-series data store for analysis. You can set up the service to collect only the data you need from your devices, apply mathematical transforms to process the data, and enrich it with device-specific metadata such as device type and location before storing it. You can then analyze your data by running queries with the built-in SQL query engine, or perform more complex analytics and machine learning inference. AWS IoT Analytics includes pre-built models for common IoT use cases, so you can answer questions such as which devices are about to fail or which customers are at risk of abandoning their wearable devices.
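A typical flow wires four resources together: a channel collects raw messages, a pipeline processes them into a data store, and a dataset queries the data store. The sketch below uses the subcommands listed in the table that follows; all resource names (my_channel, my_pipeline, my_datastore, my_dataset) are placeholders.

```bash
# A channel collects and archives raw, unprocessed messages.
aws iotanalytics create-channel --channel-name my_channel

# A data store is the repository for processed messages.
aws iotanalytics create-datastore --datastore-name my_datastore

# A pipeline connects the two. It must begin with a channel activity and
# end with a datastore activity; up to 23 more activities can sit between.
aws iotanalytics create-pipeline \
    --pipeline-name my_pipeline \
    --pipeline-activities '[
      {"channel": {"name": "source", "channelName": "my_channel", "next": "store"}},
      {"datastore": {"name": "store", "datastoreName": "my_datastore"}}
    ]'

# A dataset materializes a SQL query against the data store.
aws iotanalytics create-dataset \
    --dataset-name my_dataset \
    --actions '[{"actionName": "query_all", "queryAction": {"sqlQuery": "SELECT * FROM my_datastore"}}]'
```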
Subcommands
Name | Description |
---|---|
batch-put-message | Sends messages to a channel (see the ingestion sketch after this table) |
cancel-pipeline-reprocessing | Cancels the reprocessing of data through the pipeline |
create-channel | Creates a channel. A channel collects data from an MQTT topic and archives the raw, unprocessed messages before publishing the data to a pipeline |
create-dataset | Creates a dataset. A dataset stores data retrieved from a data store by applying a queryAction (a SQL query) or a containerAction (executing a containerized application). This operation creates the skeleton of a dataset. The dataset can be populated manually by calling CreateDatasetContent or automatically according to a trigger you specify |
create-dataset-content | Creates the content of a dataset by applying a queryAction (a SQL query) or a containerAction (executing a containerized application); see the dataset-content sketch after this table |
create-datastore | Creates a data store, which is a repository for messages |
create-pipeline | Creates a pipeline. A pipeline consumes messages from a channel and allows you to process the messages before storing them in a data store. You must specify both a channel and a datastore activity and, optionally, as many as 23 additional activities in the pipelineActivities array |
delete-channel | Deletes the specified channel |
delete-dataset | Deletes the specified dataset. You do not have to delete the content of the dataset before you perform this operation |
delete-dataset-content | Deletes the content of the specified dataset |
delete-datastore | Deletes the specified data store |
delete-pipeline | Deletes the specified pipeline |
describe-channel | Retrieves information about a channel |
describe-dataset | Retrieves information about a dataset |
describe-datastore | Retrieves information about a data store |
describe-logging-options | Retrieves the current settings of the AWS IoT Analytics logging options |
describe-pipeline | Retrieves information about a pipeline |
get-dataset-content | Retrieves the contents of a dataset as presigned URIs |
list-channels | Retrieves a list of channels |
list-dataset-contents | Lists information about dataset contents that have been created |
list-datasets | Retrieves information about datasets |
list-datastores | Retrieves a list of data stores |
list-pipelines | Retrieves a list of pipelines |
list-tags-for-resource | Lists the tags (metadata) that you have assigned to the resource |
put-logging-options | Sets or updates the AWS IoT Analytics logging options. If you update the value of any loggingOptions field, it takes up to one minute for the change to take effect. Also, if you change the policy attached to the role you specified in the roleArn field (for example, to correct an invalid policy), it takes up to five minutes for that change to take effect (see the logging sketch after this table) |
run-pipeline-activity | Simulates the results of running a pipeline activity on a message payload (see the simulation sketch after this table) |
sample-channel-data | Retrieves a sample of messages from the specified channel ingested during the specified timeframe. Up to 10 messages can be retrieved |
start-pipeline-reprocessing | Starts the reprocessing of raw message data through the pipeline (see the reprocessing sketch after this table) |
tag-resource | Adds to or modifies the tags of the given resource. Tags are metadata that can be used to manage a resource (see the tagging sketch after this table) |
untag-resource | Removes the given tags (metadata) from the resource |
update-channel | Updates the settings of a channel |
update-dataset | Updates the settings of a dataset |
update-datastore | Updates the settings of a data store |
update-pipeline | Updates the settings of a pipeline. You must specify both a channel and a datastore activity and, optionally, as many as 23 additional activities in the pipelineActivities array |
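Examples

The sketches below are illustrative rather than exhaustive; resource names, ARNs, account IDs, timestamps, and IDs are placeholders. To ingest a message into a channel and spot-check what arrived (message payloads are base64-encoded blobs; the payload below encodes {"temp": 21.5}):

```bash
# Send a single message to the channel.
aws iotanalytics batch-put-message \
    --channel-name my_channel \
    --messages '[{"messageId": "msg-001", "payload": "eyJ0ZW1wIjogMjEuNX0="}]'

# Retrieve a sample of recently ingested messages (10 at most).
aws iotanalytics sample-channel-data \
    --channel-name my_channel \
    --max-messages 5
```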
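To materialize and retrieve dataset content ($LATEST_SUCCEEDED selects the newest successfully completed version):

```bash
# Run the dataset's query and generate a new content version.
aws iotanalytics create-dataset-content --dataset-name my_dataset

# Check the creation status of content versions.
aws iotanalytics list-dataset-contents --dataset-name my_dataset

# Retrieve presigned URIs for downloading the results.
aws iotanalytics get-dataset-content \
    --dataset-name my_dataset \
    --version-id '$LATEST_SUCCEEDED'
```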
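To test a pipeline activity without deploying it, run-pipeline-activity simulates the activity against sample payloads. The math activity below is one illustration; the payload is the same base64-encoded {"temp": 21.5}:

```bash
# Compute a new temp_f attribute from the existing temp attribute.
aws iotanalytics run-pipeline-activity \
    --pipeline-activity '{"math": {"name": "to_fahrenheit", "attribute": "temp_f", "math": "temp * 1.8 + 32"}}' \
    --payloads '["eyJ0ZW1wIjogMjEuNX0="]'
```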
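To replay raw channel data through a pipeline (for example, after fixing an activity), optionally limit the run to a time window; the call returns a reprocessingId that can be used to cancel the job:

```bash
aws iotanalytics start-pipeline-reprocessing \
    --pipeline-name my_pipeline \
    --start-time 2024-01-01T00:00:00Z \
    --end-time 2024-01-08T00:00:00Z

# Cancel with the ID returned by the previous command (placeholder shown).
aws iotanalytics cancel-pipeline-reprocessing \
    --pipeline-name my_pipeline \
    --reprocessing-id <reprocessing-id>
```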
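To turn on service-side logging (ERROR is the only level the service currently supports; the role ARN is a placeholder and must grant AWS IoT Analytics permission to write to CloudWatch Logs):

```bash
aws iotanalytics put-logging-options \
    --logging-options 'roleArn=arn:aws:iam::123456789012:role/iota-logging-role,level=ERROR,enabled=true'

# Verify; updates can take up to one minute to take effect.
aws iotanalytics describe-logging-options
```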
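To manage tags, address resources by ARN (region and account ID below are placeholders; a channel ARN has the form arn:aws:iotanalytics:region:account-id:channel/name):

```bash
aws iotanalytics tag-resource \
    --resource-arn arn:aws:iotanalytics:us-east-1:123456789012:channel/my_channel \
    --tags '[{"key": "environment", "value": "test"}]'

aws iotanalytics list-tags-for-resource \
    --resource-arn arn:aws:iotanalytics:us-east-1:123456789012:channel/my_channel

aws iotanalytics untag-resource \
    --resource-arn arn:aws:iotanalytics:us-east-1:123456789012:channel/my_channel \
    --tag-keys environment
```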