How to store data with AWS IoT SiteWise Edge in many locations


Introduction

In this post, we discuss how AWS IoT SiteWise and AWS IoT SiteWise Edge can be used to store data not only in the AWS IoT SiteWise data store but also in many other locations.

Customers told us that they want to use AWS IoT SiteWise to collect their industrial data from OPC-UA data sources. However, not all customers want to store their data only in the AWS IoT SiteWise data store. In this blog post, we describe how to store data in other services like Amazon S3 or Amazon Timestream, or to consume the data in a customer's on-premises environment.

AWS IoT SiteWise is a managed service that lets you collect, model, analyze, and visualize data from industrial equipment at scale. An AWS IoT SiteWise gateway collects data from industrial equipment and stores data in the AWS IoT SiteWise data store in the cloud.

AWS IoT SiteWise Edge brings features of AWS IoT SiteWise in the cloud to the customer's premises. You can process data in the AWS IoT SiteWise gateway locally, and visualize equipment data using local AWS IoT SiteWise Monitor dashboards served from the AWS IoT SiteWise gateway.

By default, data is stored in the AWS IoT SiteWise data store on AWS.

In this blog post, we describe how customers can realize the benefits of the AWS IoT SiteWise Edge gateway to collect data but store data outside of the AWS IoT SiteWise data store.

Time to read             8 min
Learning level           300
Services used            AWS IoT SiteWise Edge, AWS IoT Greengrass, Amazon Kinesis Data Streams, Amazon Timestream

Solution

Deploying AWS IoT SiteWise Edge gateway on AWS IoT Greengrass Version 2

I'm going to explain how the AWS IoT SiteWise Edge gateway is deployed on AWS IoT Greengrass Version 2.

The AWS IoT SiteWise Edge gateway runs in the form of components on AWS IoT Greengrass Version 2. The Data Collection Pack includes two components, the SiteWiseEdgeCollectorOpcua and the SiteWiseEdgePublisher. The Data Processing Pack includes the single component SiteWiseEdgeProcessor.

The Data Collection Pack collects your industrial data and routes it to AWS destinations. The Data Processing Pack enables the gateway to communicate with edge-configured asset models and assets. You can use edge configuration to control what asset data to compute and process locally. You can then send your data to AWS IoT SiteWise or other AWS services in the cloud.

The following screenshot shows an AWS IoT Greengrass V2 deployment with the Data Collection Pack and Data Processing Pack deployed.

Figure 1: AWS IoT Greengrass V2 deployment

Understanding AWS IoT SiteWise gateway architecture

To send data to locations other than the AWS IoT SiteWise data store, you first need to understand the default architecture of the AWS IoT SiteWise gateway.

Data is ingested into the AWS IoT SiteWise data store as follows. Data is collected by the SiteWiseEdgeCollectorOpcua from OPC-UA sources and ingested into an AWS IoT Greengrass stream on the gateway, by default the SiteWise_Stream. The SiteWiseEdgePublisher reads the data from the stream and transfers it to the AWS IoT SiteWise data store on AWS.

Figure 2: AWS IoT SiteWise gateway architecture

Configuring destinations in the AWS IoT SiteWise gateway to store data in many locations

To send data to a destination other than the AWS IoT SiteWise data store, the gateway configuration lets you configure the AWS IoT Greengrass stream name where the SiteWiseEdgeCollectorOpcua stores the data. You define the stream name for each data source in your AWS IoT SiteWise gateway. You can use the AWS IoT SiteWise console, the AWS CLI, or an AWS SDK to configure the stream name.
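As a sketch of the SDK route, the snippet below uses the AWS SDK for Python (boto3) to call update_gateway_capability_configuration. The capability JSON layout shown here is a simplified assumption, and the gateway ID, source name, endpoint URI, and security settings are placeholders you must adapt to your environment.

```python
import json


def build_opcua_capability(source_name, endpoint_uri, stream_name):
    """Build a simplified OPC-UA collector capability configuration whose
    destination is a custom AWS IoT Greengrass stream (layout assumed)."""
    return {
        "sources": [{
            "name": source_name,
            "endpoint": {
                "certificateTrust": {"type": "TrustAny"},
                "endpointUri": endpoint_uri,
                "securityPolicy": "NONE",
                "messageSecurityMode": "NONE",
            },
            "destination": {
                "type": "StreamManager",
                "streamName": stream_name,
                "streamBufferSize": 10,
            },
        }]
    }


def set_source_destination(gateway_id, capability):
    """Apply the capability configuration to the gateway."""
    import boto3  # imported here so the builder above stays testable offline
    client = boto3.client("iotsitewise")
    client.update_gateway_capability_configuration(
        gatewayId=gateway_id,
        capabilityNamespace="iotsitewise:opcuacollector:2",
        capabilityConfiguration=json.dumps(capability),
    )
```

The builder is kept separate from the API call so the configuration can be reviewed or versioned before it is applied.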

You can create your own custom stream on AWS IoT Greengrass V2 and point the destination for a data source to that stream. A stream can have an export definition, which defines the AWS destination to which your data will be transferred. Currently, AWS IoT SiteWise, AWS IoT Analytics, Amazon S3, and Amazon Kinesis Data Streams are supported as export configurations. When you export your data to Amazon Kinesis Data Streams, you have many options to read the data from Amazon Kinesis Data Streams and transfer it to another service. With consumers reading data from Amazon Kinesis Data Streams, you can send your data to different locations.

If you want, for example, to store your data in Amazon Timestream, you can use an AWS Lambda function or Amazon Kinesis Data Analytics for Apache Flink as a consumer for Amazon Kinesis Data Streams and write the data into your Amazon Timestream table.

With such an architecture, you can store your data not only in Amazon Timestream but also in any location that is reachable from your Amazon Kinesis Data Streams consumer.
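To make the Lambda option concrete, here is a minimal sketch of such a consumer. The Timestream database and table names, and the assumption that stream entries arrive as JSON objects with assetId, propertyId, value, and timestamp fields, are illustrative and not part of the original post.

```python
import base64
import json

DATABASE_NAME = "sitewise"   # assumed Timestream database
TABLE_NAME = "asset_data"    # assumed Timestream table


def to_timestream_record(payload):
    """Map one decoded stream entry to a Timestream record (layout assumed)."""
    return {
        "Dimensions": [
            {"Name": "assetId", "Value": payload["assetId"]},
            {"Name": "propertyId", "Value": payload["propertyId"]},
        ],
        "MeasureName": "value",
        "MeasureValue": str(payload["value"]),
        "MeasureValueType": "DOUBLE",
        "Time": str(payload["timestamp"]),
        "TimeUnit": "MILLISECONDS",
    }


def handler(event, context):
    """Lambda entry point for a Kinesis event source mapping."""
    import boto3  # imported here so the mapping above stays testable offline
    records = [
        to_timestream_record(json.loads(base64.b64decode(r["kinesis"]["data"])))
        for r in event["Records"]
    ]
    boto3.client("timestream-write").write_records(
        DatabaseName=DATABASE_NAME, TableName=TABLE_NAME, Records=records)
```

Kinesis delivers record data base64-encoded inside the Lambda event, which is why the handler decodes each record before mapping it.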

If you aren't using an export configuration for a custom stream, you can develop your own AWS IoT Greengrass component to consume data from your custom stream.

Figure 3: Architecture to store data in many locations with AWS IoT SiteWise

Understanding AWS IoT SiteWise Edge gateway architecture

The AWS IoT SiteWise Edge gateway architecture differs from the AWS IoT SiteWise gateway architecture in that it includes the SiteWiseEdgeProcessor, which lets you serve AWS IoT SiteWise Monitor portals at the edge and also process data at the edge.

Figure 4: AWS IoT SiteWise Edge gateway architecture

To send data from AWS IoT SiteWise Edge to many locations, you can't use the same approach as with AWS IoT SiteWise. A custom stream for a data source defines where the SiteWiseEdgeCollectorOpcua sends the data. The Data Processing Pack already uses the custom stream name SiteWise_Edge_Stream. If you changed the stream name to your own custom stream, your data wouldn't reach the SiteWiseEdgeProcessor.

Configure AWS IoT SiteWise Edge to store data in many locations

There are several options to send data from AWS IoT SiteWise Edge to many locations. If you don't want to send data to the AWS IoT SiteWise data store, you must remove the SiteWiseEdgePublisher from your AWS IoT Greengrass deployment, because the SiteWiseEdgePublisher reads data from the SiteWise_Stream and stores it in the AWS IoT SiteWise data store.

You can use the API at the edge to retrieve data and store it, for example, in a stream on AWS IoT Greengrass for further processing. This option requires you to query the API for every single asset property, and if your asset properties change, you must also change your application or the application's configuration.
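A minimal sketch of this polling approach follows, under the assumption that a boto3 iotsitewise client can be pointed at the gateway's local endpoint; the endpoint URL, TLS handling, and the list of asset property IDs are placeholders to adapt to your gateway setup.

```python
def latest_values(client, asset_properties):
    """Poll the latest value of each (assetId, propertyId) pair through the
    AWS IoT SiteWise API. Every property must be listed explicitly, so this
    list has to change whenever your asset properties change."""
    values = {}
    for asset_id, property_id in asset_properties:
        response = client.get_asset_property_value(
            assetId=asset_id, propertyId=property_id)
        values[(asset_id, property_id)] = response["propertyValue"]["value"]
    return values


def make_edge_client(endpoint_url):
    """Create a client against the gateway's local endpoint; the URL and the
    disabled certificate verification are placeholder assumptions."""
    import boto3
    return boto3.client("iotsitewise", endpoint_url=endpoint_url, verify=False)
```

Keeping the property list as explicit input data illustrates the drawback described above: the application configuration must track every asset property you want to read.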

Another option is to develop a component to read data from the SiteWise_Stream. The component transfers the data to another destination such as another stream or a target in your on-premises environment.

Figure 5: Architecture to store data in many locations with AWS IoT SiteWise Edge

In the following example, we explain how you can read data from the SiteWise_Stream and, in one case, ingest the data into a custom stream to be transferred to AWS and, in another case, publish the data to a local MQTT message broker. The custom stream is created with an export configuration to Amazon Kinesis Data Streams on AWS.

The following code snippets are based on an AWS IoT Greengrass V2 component written in Python. The code uses the AWS IoT Greengrass Stream Manager SDK for Python and the Paho Python client.

The following variables are used in the custom component.

  • STREAM_NAME_SOURCE is the name of the stream to read the data from.
  • STREAM_NAME_TARGET is the name of your custom stream where you want to send the data to.
  • STREAM_NAME_CLOUD is the name of the Amazon Kinesis data stream on AWS. The stream STREAM_NAME_TARGET is created with an export configuration to the STREAM_NAME_CLOUD.

For instance:

STREAM_NAME_SOURCE = "SiteWise_Stream"
STREAM_NAME_TARGET = "SiteWise_Anywhere_Stream"
STREAM_NAME_CLOUD = "SiteWiseToKinesisDatastream"

Before starting the component, you must create an Amazon Kinesis data stream with the stream name STREAM_NAME_CLOUD on AWS.

Upon start, the component checks whether the stream STREAM_NAME_TARGET exists. If the stream doesn't exist, it is created with an export configuration to Amazon Kinesis Data Streams on AWS.

# imports from the AWS IoT Greengrass Stream Manager SDK for Python
from stream_manager import (
    ExportDefinition,
    KinesisConfig,
    MessageStreamDefinition,
    Persistence,
    ResourceNotFoundException,
    StrategyOnFull,
)

try:
    response = stream_manager_client.describe_message_stream(STREAM_NAME_TARGET)
    logger.info("stream_name: %s details: %s", STREAM_NAME_TARGET, response)
except ResourceNotFoundException as error:
    logger.info("create message stream: %s error: %s", STREAM_NAME_TARGET, error)

    exports = ExportDefinition(
        kinesis=[KinesisConfig(
            identifier=f"{STREAM_NAME_CLOUD}",
            kinesis_stream_name=STREAM_NAME_CLOUD,
            batch_size=10,
            batch_interval_millis=60000
        )]
    )

    stream_manager_client.create_message_stream(
        MessageStreamDefinition(
            name=STREAM_NAME_TARGET,
            strategy_on_full=StrategyOnFull.OverwriteOldestData,
            persistence=Persistence.File,
            max_size=1048576,
            export_definition=exports
        )
    )
except Exception as error:
    logger.error("%s", error)

The component reads messages from the STREAM_NAME_SOURCE. Once messages are available, it iterates over the entries in a message and starts threads to store the entries in a custom stream and to publish them to an MQTT message broker.

response = stream_manager_client.read_messages(
    STREAM_NAME_SOURCE,
    ReadMessagesOptions(
        desired_start_sequence_number=LAST_READ_SEQ_NO + 1,
        min_message_count=MIN_MESSAGE_COUNT,
        read_timeout_millis=1000
    )
)

for entry in response:
    logger.info("stream_name: %s payload: %s",
                STREAM_NAME_SOURCE, entry.payload)

    # send data to another stream at the edge
    thread_stream = Thread(
        target=store_message_to_stream,
        args=[entry.payload])
    thread_stream.start()
    logger.info('thread_stream started: %s', thread_stream)

    # send data to a local MQTT message broker
    thread_mqtt = Thread(
        target=publish_message_to_mqtt_broker,
        args=[entry.payload])
    thread_mqtt.start()
    logger.info('thread_mqtt started: %s', thread_mqtt)

The following function writes data to the custom stream STREAM_NAME_TARGET. Data ingested into this custom stream is transferred automatically to Amazon Kinesis Data Streams on AWS.

def store_message_to_stream(payload):
    try:
        sequence_number = stream_manager_client.append_message(
            stream_name=STREAM_NAME_TARGET, data=payload)
        logger.info('appended message to stream: %s sequence_number: %s message: %s',
                    STREAM_NAME_TARGET, sequence_number, payload)
    except Exception as error:
        logger.error("append message to stream: %s: %s",
                     STREAM_NAME_TARGET, error)

The following function publishes data to the topic sitewise on an MQTT message broker.

def publish_message_to_mqtt_broker(payload):
    try:
        logger.info('MQTT: publish message: %s', payload)
        c_mqtt = paho.Client()
        c_mqtt.on_publish = mqtt_on_publish
        c_mqtt.on_disconnect = mqtt_on_disconnect
        c_mqtt.connect(MQTT_BROKER, MQTT_PORT)
        ret = c_mqtt.publish("sitewise", payload)
        logger.info('MQTT: publish: ret: %s', ret)
        c_mqtt.disconnect()
    except Exception as error:
        logger.error("MQTT: publish message: %s", error)

Conclusion

In this blog, you have learned how you can use an AWS IoT SiteWise gateway to collect data from your industrial equipment and send it to many locations. You have learned how to configure your gateway to send data from AWS IoT SiteWise or AWS IoT SiteWise Edge to a custom destination. Based on sample code, you have seen how you can transfer your data to a custom location on AWS and into your on-premises environment. Learn more on the AWS IoT SiteWise product page or in the AWS IoT SiteWise workshops.

About the author

Philipp Sacha

Philipp is a Specialist Solutions Architect for IoT at Amazon Web Services supporting customers in the IoT area. He joined AWS in 2015 as a general Solutions Architect and moved in 2018 into the role of a Specialist in the IoT area.


