SQLstream s-Server ships with several demo applications and scripts that stream demonstration data. Each is designed to illustrate one or more particular aspects of the SQLstream system, its tools, or a type of streaming data application.
The demos include streaming bus data, IoT sensor data, and CDN video-player telemetry data.
Source code, scripts, and other files needed for each demo are in the directory s-Server/demo/ in the SQLstream installation directory.
You can use datagen to generate sample data.
There are also several scripts for streaming sample data in s-Server/demo/data:
These scripts stream sample data from buses in the Sydney area into a file located at /tmp/buses.log at 50 rows per second, simulating a log file that is continually updated.
Script | Definition |
---|---|
StreamBusData.sh | Starts streaming buses data in CSV format to /tmp/buses.log. |
StopBusData.sh | Stops streaming in CSV format to /tmp/buses.log. |
StreamJsonBusData.sh | Starts streaming buses data in JSON format to /tmp/buses.log. |
StopJsonBusData.sh | Stops streaming buses data in JSON format to /tmp/buses.log. |
StreamXmlBusData.sh | Starts streaming buses data in XML format to /tmp/buses.log. |
StopXmlBusData.sh | Stops streaming buses data in XML format to /tmp/buses.log. |
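The stream scripts keep appending rows to the log until the matching stop script is run. A minimal sketch of that behavior (with hypothetical rows and a hypothetical path, not the demo's actual output) looks like this:

```shell
# Illustrative sketch only -- not the demo script itself. Appends a few
# hypothetical CSV rows to a log file, simulating a continually updated log.
LOG=/tmp/buses_demo.log          # hypothetical path, to avoid clobbering /tmp/buses.log
: > "$LOG"                       # truncate the file
for i in 1 2 3; do
  printf '%s\n' "$i,2020-01-01 12:00:0$i,7,42" >> "$LOG"
done
rows=$(wc -l < "$LOG")
echo "wrote $rows rows"
```

In the real demo, StreamBusData.sh keeps appending rows at 50 per second until StopBusData.sh is run.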
The bus data streams contain the following columns:
Column | Type | Definition |
---|---|---|
id | DOUBLE | Identification number for the bus. |
reported_at | TIMESTAMP | Time at which the location was reported. |
shift_no | DOUBLE | Shift number for the bus's driver. |
driver_no | DOUBLE | Driver identification number. |
prescribed | VARCHAR(4096) | The direction on the motorway (into Sydney or out of Sydney). |
highway | DOUBLE | Highway number if available. |
gps | VARCHAR | GPS information with latitude, longitude, and bearing in JSON format. |
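For a quick look at pulling columns out of the CSV log, here is a sketch using awk on a hypothetical row that follows the column order above (the demo's exact row formatting may differ):

```shell
# Hypothetical CSV row following the column order above (id, reported_at,
# shift_no, driver_no, prescribed, highway, gps). Not actual demo output.
row='1234,2020-01-01 12:00:00,7,42,into Sydney,1,"{""lat"": -33.86}"'

# Pull out individual columns with awk. The first four fields contain no
# embedded commas, so a plain comma separator is safe for them.
bus_id=$(printf '%s' "$row" | awk -F',' '{print $1}')
driver_no=$(printf '%s' "$row" | awk -F',' '{print $4}')
echo "bus $bus_id, driver $driver_no"
```

Fields that may themselves contain commas, such as the JSON-formatted gps column, need a proper CSV or JSON parser rather than a plain comma split.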
This demonstration script streams simulated environmental sensor data from around the world into a Kafka topic named IoT. If you have installed SQLstream in a Docker container or appliance, you can start this demonstration script on the Guavus SQLstream cover page.
To run the IoT demonstration on a Linux installation, you need to have installed a Kafka broker and the kafkacat utility.
To install a Kafka broker, download it from https://kafka.apache.org/downloads and follow the quickstart instructions at https://kafka.apache.org/quickstart to start ZooKeeper and Kafka.
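Assuming Kafka is unpacked at /opt/Kafka as the demo expects, the quickstart's start-up steps look roughly like this (the -daemon flag runs each service in the background; the existence check simply avoids errors if Kafka is installed elsewhere):

```shell
KAFKA_HOME=/opt/Kafka   # assumed install location, per the demo's expectation
if [ -x "$KAFKA_HOME/bin/zookeeper-server-start.sh" ]; then
  # Start ZooKeeper first, then the Kafka broker.
  "$KAFKA_HOME/bin/zookeeper-server-start.sh" -daemon "$KAFKA_HOME/config/zookeeper.properties"
  "$KAFKA_HOME/bin/kafka-server-start.sh"     -daemon "$KAFKA_HOME/config/server.properties"
  status=started
else
  status=missing
  echo "Kafka not found at $KAFKA_HOME"
fi
```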
The IoT demonstration script assumes that Kafka is installed at /opt/Kafka.
To install the kafkacat utility, run one of the following commands:
For Ubuntu:
sudo apt-get install -y kafkacat
For CentOS:
sudo yum install kafkacat
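After installation, you can confirm that kafkacat is on your PATH; this check works on either distribution:

```shell
# Verify the kafkacat binary is available after installation.
if command -v kafkacat >/dev/null 2>&1; then
  kc_status="found: $(command -v kafkacat)"
else
  kc_status="not found on PATH"
fi
echo "kafkacat $kc_status"
```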
Once you have Kafka and kafkacat installed, you can manage the IoT demonstration with the following scripts:
Command | Result |
---|---|
$SQLSTREAM_HOME/demo/IoT/start.sh | Streams data to a Kafka topic named IoT. |
$SQLSTREAM_HOME/demo/IoT/stop.sh | Stops streaming data to the IoT topic. |
$SQLSTREAM_HOME/demo/IoT/status.sh | Checks the status of the script. |
To start the demo, run:
$SQLSTREAM_HOME/demo/IoT/start.sh
The script will return something along the following lines:
started
drew@drew-VirtualBox:/$
Keep this terminal window open for as long as you want data to stream.
To confirm that data is streaming, open a new terminal and run the following (assuming that you have installed Kafka at /opt/Kafka):
/opt/Kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic IoT --from-beginning
This demonstration script streams simulated telemetry data from video players around the world into a Kafka topic named cdn. If you have installed SQLstream in a Docker container or appliance, you can start this demonstration script on the Guavus SQLstream cover page.
To run the cdn demonstration on a Linux installation, you need to have installed a Kafka broker and the kafkacat utility.
To install a Kafka broker, download it from https://kafka.apache.org/downloads and follow the quickstart instructions at https://kafka.apache.org/quickstart to start ZooKeeper and Kafka.
The cdn demonstration script assumes that Kafka is installed at /opt/Kafka.
To install the kafkacat utility, run one of the following commands:
For Ubuntu:
sudo apt-get install -y kafkacat
For CentOS:
sudo yum install kafkacat
Once you have Kafka and kafkacat installed, you can manage the cdn demonstration with the following scripts:
Command | Result |
---|---|
$SQLSTREAM_HOME/demo/cdn/start.sh | Streams data to a Kafka topic named cdn. |
$SQLSTREAM_HOME/demo/cdn/stop.sh | Stops streaming data to the cdn topic. |
$SQLSTREAM_HOME/demo/cdn/status.sh | Checks the status of the script. |
To start the demo, run:
$SQLSTREAM_HOME/demo/cdn/start.sh
The script will return something along the following lines:
started
drew@drew-VirtualBox:/$
Keep this terminal window open for as long as you want data to stream.
To confirm that data is streaming, open a new terminal and run the following (assuming that you have installed Kafka at /opt/Kafka):
/opt/Kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic cdn --from-beginning