What could be the reasons for the OPC-UA Collector taking so long to parse the nodes?


We are using Greengrass V2 to ingest data from an OPC-UA server, and at one particular site we observed that the OPC-UA Collector takes a very long time (more than 1-2 hours) to get started. We have two different data sources defined in this case, both with the same OPC-UA server as the source of data ingestion but with different measurements for different sinks (SiteWise and Kinesis).

From the logs we suspect that it spends this time parsing the nodes for both data sources, treating each of them as a separate server, and that among them it prioritises the SiteWise source first, whereas the Kinesis sink is the more critical one for our use case. As far as we currently understand, we cannot set priorities among the data sources, and even with such a priority feature, fetching the nodes would still take a huge amount of time.

What could cause the OPC-UA Collector to take such a long time to get started, how can we confirm the cause, and how can we solve this problem? Are there any optimisations we should follow when defining a data source?

Connectivity and Configuration:

  • Greengrass machine: a c6in.xlarge EC2 instance in the cloud.
  • The actual OPC-UA server runs on a remote machine, connected to the Greengrass machine over a VPN link.
  • Greengrass Version: 2.8.1
  • SiteWiseEdgeCollectorOpcua Version: 2.1.3
  • Node Structure: Object -> R1 -> R2 -> {device}_{measurement}, where R1 and R2 are directory nodes, and under R2 there are more than 4000 tags with the name pattern {device}_{measurement}
  • The data source for the Greengrass stream is defined as follows:
    • Node ID for selection: /R1/R2/D*
    • Node Paths for Group Properties: /R1/R2/D*
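To get a feel for whether browsing alone can account for hours of startup time, the sketch below does a back-of-the-envelope estimate of browse traffic over a high-latency VPN. All parameters (references returned per round trip, the extra metadata reads) are illustrative assumptions, not the collector's actual internals:

```python
import math

def estimated_browse_seconds(num_nodes, rtt_ms, refs_per_browse=1000, data_sources=1):
    """Rough estimate of wall-clock time spent browsing an OPC-UA address space.

    num_nodes:        leaf tags under the filtered path (e.g. ~4000 here)
    rtt_ms:           round-trip time to the OPC-UA server in milliseconds
    refs_per_browse:  references returned per Browse/BrowseNext round trip
    data_sources:     each data source browses the server independently
    """
    round_trips = math.ceil(num_nodes / refs_per_browse) + 2  # + root/R1/R2 hops
    # Reading per-node metadata (DataType, DisplayName) typically adds further
    # round trips; assume one extra batched Read per browse batch here.
    round_trips += math.ceil(num_nodes / refs_per_browse)
    return data_sources * round_trips * rtt_ms / 1000.0
```

With well-batched requests (1000 references per round trip), 4000 tags at a 500 ms RTT come out to only a few seconds, but if browsing or metadata reads happen node by node (`refs_per_browse=1`), the same address space already costs over an hour per data source. This suggests checking whether the link's latency, rather than the node count itself, dominates startup.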
2 Answers

Hello Sahil, I experienced similar issues when my OPC-UA server had tags configured with datatypes other than those that can be ingested, given that SiteWise presently only consumes String, Integer, Double, and Boolean (https://docs.aws.amazon.com/iot-sitewise/latest/userguide/measurements.html).
My OPC-UA server was publishing NaN, string arrays, Long datatypes, etc., and the Greengrass services seemed to be busy filtering and rejecting data (look in the rejected log for hints on dropped tags). The issue was resolved by deploying the latest OPC-UA collector component and forcing the OPC-UA server to publish only what it could, according to the SiteWise datatypes; we just have to live with not being able to ingest OPC arrays at this point in time, while awaiting AWS development. We are presently ingesting ~35,000 tags per second.
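As an illustration of the datatype filtering described above, the helper below is a minimal, hypothetical pre-screen you could run against sampled tag values before pointing a source at SiteWise. The variant field names follow the SiteWise data model, but the function itself is a sketch, not part of any AWS component:

```python
import math

def sitewise_variant(value):
    """Map a sampled OPC-UA value to a SiteWise variant field name,
    or None if SiteWise cannot ingest it (arrays, NaN, unsupported types)."""
    if isinstance(value, bool):          # check bool before int: bool is an int subclass
        return "booleanValue"
    if isinstance(value, int):
        # SiteWise integers are 32-bit; wider values would not fit
        return "integerValue" if -2**31 <= value < 2**31 else None
    if isinstance(value, float):
        return None if math.isnan(value) else "doubleValue"
    if isinstance(value, str):
        return "stringValue"
    return None  # lists/arrays, bytes, None, custom structures
```

Running this over a sample of each tag quickly surfaces which tags (NaN publishers, arrays, out-of-range longs) would end up in the rejected log.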

JasonH
answered a year ago
  • Thanks for the response, JasonH. I believe this would also show up as ConversionErrors in the OPC-UA Collector logs, and the log we receive reports that value as 0. Also, the measurements we have are mostly double or integer only. Beyond that, I believe a conversion issue would only occur once the collector starts receiving data, which is after it has created the required groups, and that is the step that takes so long.

  • JasonH, if you can, could you please share the network configuration in your setup and the latency you see?


Can you see a RejectedPropertyValues.log file for further insight in /greengrass/v2/work/aws.iot.SiteWiseEdgePublisher/exports? (I'm using a Linux edge device.) Latency-wise, the network is <500 ms from the OPC server (a remote mine site in the Australian outback!) to the Greengrass client / OPC-UA collector.
Overall, the SiteWise datastreams lag approximately 1 minute from the remote OPC server to the Amazon Managed Grafana visualisation.

JasonH
answered a year ago
  • I actually did not check that; the SiteWise data is our secondary source, which we use for a different use case, so it does not carry the full data. Our main issue is with Kinesis, which is a Greengrass stream manager stream fed by the OPC-UA Collector. We did check /greengrass/v2/work/aws.greengrass.StreamManager/greengrass_stream and observed that no data appears for the first 3 hours, and then when it starts appearing, the latest data written to the files is 10-15 minutes old. That is why I suspect it is an issue with the OPC-UA Collector.
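One way to quantify that lag without guessing is to watch the modification time of the newest file the stream manager has written. The helper below is a generic sketch (the directory to pass in would be the stream path quoted above; nothing here is specific to stream manager's file format):

```python
import pathlib
import time

def newest_file_lag_seconds(stream_dir):
    """Return seconds since the most recent file in stream_dir was modified,
    a rough proxy for how far the stream lags behind 'now'.
    Returns None if nothing has been written yet."""
    files = [p for p in pathlib.Path(stream_dir).iterdir() if p.is_file()]
    if not files:
        return None  # collector has not written anything yet
    newest_mtime = max(p.stat().st_mtime for p in files)
    return time.time() - newest_mtime
```

Sampling this every few minutes during startup would distinguish "no data written at all" (None, pointing at the collector still browsing) from "data written but stale" (a large lag, pointing at throughput or buffering downstream).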
