For testing it is sometimes too much effort to deal with security :-). To make the OPC UA server in WinCC OA insecure, add the following lines to the config file.
Connected Python to WinCC OA through a WebSocket Manager. Python programs can connect to WinCC OA and read/write datapoints. The communication is JSON based and simple to use from Python, see the examples below (ws://rocworks.no-ip.org can be used for tests, but it will not be available all the time).
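A minimal Python sketch of such a client could look like this (the port and the exact JSON message layout are assumptions for illustration, not the documented protocol of the WebSocket Manager):

# Requires: pip install websocket-client
import json
import websocket

# The port and the message format below are assumptions for illustration.
ws = websocket.create_connection("ws://rocworks.no-ip.org:8080")

# Hypothetical read request for one datapoint element
ws.send(json.dumps({"DpGet": {"Dps": ["ExampleDP_Arg1.:_online.._value"]}}))
print(ws.recv())  # the server answers with a JSON document

# Hypothetical write request
ws.send(json.dumps({"DpSet": {"Dps": [{"Dp": "ExampleDP_Arg1.", "Value": 42.0}]}}))
ws.close()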
To learn how deep learning works I decided to implement a Multilayer Neural Network with Backpropagation on my own in Clojure (I did NOT use a library like TensorFlow or Deeplearning4j). The link between Clojure and the SCADA system WinCC OA is oa4j. With that connection the Neural Network can be used and trained with sensor data collected by the SCADA system …
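The implementation itself is in Clojure; just to illustrate the idea of backpropagation independently of that code, a tiny NumPy version of a two-layer network trained on XOR could look like this (a sketch, not the Clojure implementation):

import numpy as np

# Tiny 2-layer MLP trained on XOR with plain backpropagation (sigmoid activations).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: gradient of the squared error through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient descent step
    W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(axis=0, keepdims=True)
    W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(axis=0, keepdims=True)

print(out.round(2))  # should approach [[0], [1], [1], [0]]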
KSQL makes it easy to read, write, and process streaming data in real-time, at scale, using SQL-like semantics. It offers an easy way to express stream processing transformations as an alternative to writing an application in a programming language such as Java or Python. https://www.confluent.io/product/ksql/
With WinCC OA Java (https://github.com/vogler75/oa4j) we can stream data from WinCC OA to Apache Kafka, use KSQL to produce some insights, and send the results back to WinCC OA with a WinCC OA driver written in Java that is connected to Kafka.
Attached you will find a docker-compose.yml to set up KSQL plus the WinCC OA connector and driver to test it. Just use "docker-compose up -d" to start everything. Beforehand, set the "data" and "event" environment variables in the docker-compose.yml to point to a running WinCC OA project.
root@docker1:~/docker/builds/winccoa# docker-compose up -d
Creating winccoa_frontend_1 ==> collects data from WinCC OA and publishes it via ZeroMQ
Creating winccoa_backend-kafka_1 ==> gets the data from the frontend and writes it to Kafka
Creating winccoa_driver-kafka_1 ==> WinCC OA driver to read data from Kafka
Creating winccoa_zookeeper_1
Creating winccoa_kafka_1
Creating winccoa_schema-registry_1
Creating winccoa_ksql-cli_1
We use Docker to start up the WinCC OA managers (frontend, backend) and drivers.
Afterwards you can start KSQL: docker-compose exec ksql-cli ksql-cli local --bootstrap-server kafka:29092
Create a stream on the topic that WinCC OA sends to Kafka (currently every value change in WinCC OA is sent to Kafka):
CREATE STREAM Scada_FloatVar (TimeMS BIGINT, Status BIGINT, Value DOUBLE) WITH (kafka_topic='Scada_FloatVar', value_format='JSON');
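To try the stream without a running WinCC OA project, a few sample records can be pushed with kafka-python; the value fields mirror the stream definition above, and using the datapoint name as the message key is an assumption (it matches the GROUP BY rowkey in the next statement):

# Sketch of a test producer (assumes kafka-python is installed; adjust the
# broker address to your setup, e.g. localhost:9092 from the Docker host).
import json, time
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092",
                         key_serializer=str.encode,
                         value_serializer=lambda v: json.dumps(v).encode())

for i in range(10):
    producer.send("Scada_FloatVar",
                  key="System1:ExampleDP_Arg1.",  # datapoint name as key (assumption)
                  value={"TimeMS": int(time.time() * 1000), "Status": 0, "Value": float(i)})
    time.sleep(1)
producer.flush()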
Create a result table in KSQL, which will be read by the WinCC OA driver. Here we detect whether a datapoint changes more than 5 times within 10 seconds, just a simple example to show how KSQL can be used:
CREATE TABLE result WITH (PARTITIONS=1) AS SELECT rowkey AS “Name”, count(*) AS “Value” FROM Scada_FloatVar WINDOW TUMBLING (size 10 second) GROUP BY rowkey HAVING count(*) > 5;
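To see what actually ends up in the result, a small kafka-python consumer can be used; the sink topic name is assumed to be RESULT here (derived from the table name), adjust it if your KSQL version names it differently:

# Sketch of a consumer to inspect the result topic (assumes kafka-python and
# a broker reachable on localhost:9092; topic name RESULT is an assumption).
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer("RESULT",
                         bootstrap_servers="localhost:9092",
                         auto_offset_reset="earliest",
                         value_deserializer=lambda v: json.loads(v.decode()))

for message in consumer:
    # key: datapoint name (possibly with window info), value: count per window
    print(message.key, message.value)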
Get unsolicited alerts as messages (push notifications), ask the bot about the current state of your system, query information, and send commands to the WinCC OA Bot…
This can easily be achieved by connecting WinCC OA to Node-RED via MQTT. From Node-RED it is possible to connect to Twitter and publish tweets. I installed the MQTT broker Mosquitto on my SCADA server and connected WinCC OA to it, so that I can send a datapoint value from WinCC OA to MQTT. Node-RED can read this tag from MQTT and publish the message to Twitter…
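Just to illustrate the broker side of this idea, the kind of message Node-RED could subscribe to can be published with paho-mqtt (broker address, topic name and payload format are assumptions for illustration):

# Requires: pip install paho-mqtt
import json
import paho.mqtt.publish as publish

# Hypothetical topic that Node-RED subscribes to; Mosquitto runs on the SCADA server.
publish.single("scada/ExampleDP_Arg1",
               json.dumps({"value": 42.0, "text": "Pump 1 switched off"}),
               hostname="localhost")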
For example, we do this with the Oracle alert log. Very often an Oracle database is used with WinCC OA to store historical values, but a lot of the time no one takes care of the Oracle database itself. At least the alert log file should be monitored. With Logstash, Apache Kafka, and the WinCC OA Apache Kafka driver we can send alert log messages from the Oracle database(s) to a WinCC OA monitoring system.
With Logstash we can collect the logs of WinCC OA systems and write them to Elasticsearch. Multiple WinCC OA systems can be observed with a central log database…
With Kibana the logs can easily be explored – I now see errors in my system that I had not noticed before…
In parallel the log messages are written to Apache Kafka. With Apache Spark we can now observe the log streams and detect anomalies… a very simple check could be to just count the number of log messages per timeframe…
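A minimal sketch of that simple check with Spark Structured Streaming in Python could look like this (the topic name and broker address are assumptions, and the spark-sql-kafka package must be available):

# Sketch: count log messages per one-minute tumbling window.
from pyspark.sql import SparkSession
from pyspark.sql.functions import window, col

spark = SparkSession.builder.appName("LogRate").getOrCreate()

logs = (spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "kafka:29092")   # broker address assumed
        .option("subscribe", "winccoa-logs")                 # topic name assumed
        .load())

# Use the Kafka record timestamp and count messages per window.
counts = logs.groupBy(window(col("timestamp"), "1 minute")).count()

query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()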