Commit 7b4cf31

Merge pull request #232 from quixio/dev
Docs Release 2023-12-001
2 parents 6155e09 + d04ce2a commit 7b4cf31

40 files changed: +955 -2 lines changed

docs/get-started/project-templates.md

Lines changed: 10 additions & 0 deletions

@@ -36,4 +36,14 @@ Project templates also provide a starting point for your own projects - our proj

 [Explore :octicons-arrow-right-24:](../tutorials/clickstream/overview.md)

+- __Predictive maintenance__
+
+    ---
+
+    ![Predictive maintenance pipeline](../images/project-templates/predictive-maintenance-pipeline.png)
+
+    Predicts failures in 3D printers.
+
+    [Explore :octicons-arrow-right-24:](../tutorials/predictive-maintenance/overview.md)
+
 </div>

docs/tutorials/currency-alerting/currency-alerting.md

Lines changed: 2 additions & 2 deletions

@@ -145,13 +145,13 @@ This microservice reads from the `currency-rate-alerts` topic and whenever a new

 It also reads the contents of the message and enriches the notification with details on how the threshold was crossed, that is, whether the price is moving up or down.

-To set up the push nonfiction microservice, follow these steps:
+To set up the push notification microservice, follow these steps:

 1. Click on the `Code Samples` icon in the left-hand navigation.

 2. In the search box on the Code Samples page, enter "Pushover".

-You will see the `Threshold Alert` sample appear in the search results:
+You will see the `Pushover Output` sample appear in the search results:

 ![Pushover Notifications](./images/library-pushover.png "Pushover Notifications")

docs/tutorials/overview.md

Lines changed: 12 additions & 0 deletions

@@ -44,6 +44,18 @@ Some tutorials use [project templates](../get-started/project-templates.md) - th

 [Explore :octicons-arrow-right-24:](../tutorials/clickstream/overview.md)

+- __Predictive maintenance__
+
+    ---
+
+    ![Predictive maintenance pipeline](../images/project-templates/predictive-maintenance-pipeline.png)
+
+    `Project template`
+
+    Predicts failures in 3D printers.
+
+    [Explore :octicons-arrow-right-24:](../tutorials/predictive-maintenance/overview.md)
+
 - __Train and deploy machine learning (ML)__

 ---
Lines changed: 104 additions & 0 deletions

# Alert service

Sends alerts to an output topic when the temperature is under or over the threshold.

It receives data from two topics (`3d-printer-data` and `forecast`) and triggers an alert (to the output topic `alerts`) if the temperature is under or over the threshold.

![Pipeline segment](./images/alert-pipeline-segment.png)

The default thresholds are as shown in the following table:

| Variable | Default value (degrees C) |
|----|----|
| min_ambient_temperature | 45 |
| max_ambient_temperature | 55 |
| min_bed_temperature | 105 |
| max_bed_temperature | 115 |
| min_hotend_temperature | 245 |
| max_hotend_temperature | 255 |

These thresholds are used to determine whether the temperature, or forecast temperature, is under or over the threshold values. If so, an alert is published to the `alerts` topic.

Note that there are several alert types. The message format for the `no-alert` type is:

``` json
{
  "status": "no-alert",
  "parameter_name": "hotend_temperature",
  "message": "'Hotend temperature' is within normal parameters",
  "alert_timestamp": 1701280033000000000,
  "alert_temperature": 246.04148121958596
}
```

An example of the `under-now` alert message format:

``` json
{
  "status": "under-now",
  "parameter_name": "bed_temperature",
  "alert_timestamp": 1701273328000000000,
  "alert_temperature": 104.0852349596566,
  "message": "'Bed temperature' is under the threshold (105ºC)"
}
```

Here's an `over-forecast` alert message format:

``` json
{
  "status": "over-forecast",
  "parameter_name": "forecast_fluctuated_ambient_temperature",
  "alert_temperature": 55.014602460947586,
  "alert_timestamp": 1701278280000000000,
  "message": "'Ambient temperature' is forecasted to go over 55ºC in 1:36:29."
}
```

Here's the `under-forecast` alert message format:

``` json
{
  "status": "under-forecast",
  "parameter_name": "forecast_fluctuated_ambient_temperature",
  "alert_temperature": 44.98135836928914,
  "alert_timestamp": 1701277320000000000,
  "message": "'Ambient temperature' is forecasted to fall below 45ºC in 1:20:28."
}
```
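The threshold logic can be sketched as follows. This is a hypothetical illustration based only on the message formats shown above, not the service's actual code: the `check` helper is invented, and an `over-now` status and the exact message strings are assumptions.

``` python
import time

# Default thresholds from the table above (degrees C)
THRESHOLDS = {
    "ambient_temperature": (45, 55),
    "bed_temperature": (105, 115),
    "hotend_temperature": (245, 255),
}

def check(parameter_name: str, value: float) -> dict:
    """Build an alert message for one temperature reading (illustrative only)."""
    low, high = THRESHOLDS[parameter_name]
    if value < low:
        status = "under-now"
        message = f"'{parameter_name}' is under the threshold ({low}ºC)"
    elif value > high:
        status = "over-now"  # assumed status name, by analogy with under-now
        message = f"'{parameter_name}' is over the threshold ({high}ºC)"
    else:
        status = "no-alert"
        message = f"'{parameter_name}' is within normal parameters"
    return {
        "status": status,
        "parameter_name": parameter_name,
        "alert_timestamp": time.time_ns(),  # nanosecond epoch, as in the examples
        "alert_temperature": value,
        "message": message,
    }

print(check("bed_temperature", 104.0852349596566)["status"])  # under-now
```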
These alerts are subscribed to by the Printers dashboard service, and can be displayed in real time on the scrolling charts, as well as in the scrolling alert display:

![Alerts](./images/alerts-display.png)

## Check the log messages

It can be very useful to check the logs for a service. To do this from the pipeline view:

1. Click on the Alert service in the pipeline view.

2. Click the `Logs` tab.

You can now view the log messages:

![Log messages](./images/alert-service-logging.png)

## View the message format

You can also view the actual messages being transferred through the service:

1. Click the `Messages` tab.

2. Select either the input or output topic as required from the topic drop-down:

![Topic drop down](./images/messages-topic-dropdown.png)

3. You can now explore the messages. Click on a message to display it:

![Message format](./images/message-format.png)

Make sure you click the `Live` tab to continue viewing live messages.

## 🏃‍♀️ Next step

[Part 6 - InfluxDB raw data service :material-arrow-right-circle:{ align=right }](./influxdb-raw-data.md)
Lines changed: 111 additions & 0 deletions

# Data generator

This service generates temperature data simulating one or more 3D printers. It simulates three temperature sensors on a fleet of 3D printers.

![Data generator pipeline segment](./images/data-generator-pipeline-segment.png)

For each printer, the enclosure temperature is programmed to decrease starting at the four-hour point. It drops below the minimum threshold of 45°C at 5h 47m, the failure point.

The simulation runs at 10x actual speed, so the temperature starts to drop at approximately 24 minutes, and crosses the minimum threshold at around 34m 44s.
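As a quick sanity check on the 10x speed-up, the arithmetic works out as follows (a trivial sketch; the function name is illustrative, not part of the service):

``` python
def replay_time_s(real_seconds: float, replay_speed: float = 10.0) -> float:
    """Convert real elapsed time to simulated (replayed) time."""
    return real_seconds / replay_speed

drop_start = replay_time_s(4 * 3600)         # temperature starts to drop
failure = replay_time_s(5 * 3600 + 47 * 60)  # crosses the 45°C threshold

print(drop_start / 60)  # 24.0 (minutes)
print(failure / 60)     # ≈ 34.7 (minutes)
```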
When printing with a heat-sensitive material such as ABS (Acrylonitrile Butadiene Styrene), it's important to ensure that the temperatures remain stable.

The [forecasting algorithm](./forecast-service.md) attempts to estimate when this is going to happen, and displays the alert on a dashboard.

## Data published

The generated data is published to the `3d-printer-data` topic:

* Ambient temperature
* Ambient temperature with fluctuations
* Bed temperature
* Hot end temperature
* original_timestamp
* Printer finished printing

This service runs continually.

## Exploring the message format

If you click `Topics` in the main left-hand navigation you see the topics in the environment. Click in the `Data` area to view live data. This takes you into the Quix data explorer. You can then select the stream and parameter data you'd like to explore, and view this data in either the `Table` or `Messages` view.

If you look at the messages in the `Messages` view, you'll see the data has the following format:

``` json
{
  "Epoch": 0,
  "Timestamps": [
    1701277527000000000
  ],
  "NumericValues": {
    "hotend_temperature": [
      250.8167407832582
    ],
    "bed_temperature": [
      106.9299672495977
    ],
    "ambient_temperature": [
      36.92387946005222
    ],
    "fluctuated_ambient_temperature": [
      36.92387946005222
    ]
  },
  "StringValues": {
    "original_timestamp": [
      "2023-11-29 17:05:27"
    ]
  },
  "BinaryValues": {},
  "TagValues": {
    "printer": [
      "Printer 72"
    ]
  }
}
```
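A hypothetical consumer-side sketch of flattening this message format into (timestamp, printer, parameter, value) rows. The structure mirrors the example message above; the flattening code itself is illustrative and not part of the actual service:

``` python
message = {
    "Epoch": 0,
    "Timestamps": [1701277527000000000],
    "NumericValues": {
        "hotend_temperature": [250.8167407832582],
        "bed_temperature": [106.9299672495977],
    },
    "TagValues": {"printer": ["Printer 72"]},
}

# Each parameter holds one value per entry in Timestamps
rows = []
for i, ts in enumerate(message["Timestamps"]):
    printer = message["TagValues"]["printer"][i]
    for parameter, values in message["NumericValues"].items():
        rows.append((ts, printer, parameter, values[i]))

print(rows[0])
# (1701277527000000000, 'Printer 72', 'hotend_temperature', 250.8167407832582)
```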
The Quix data explorer is a very useful tool for debugging and monitoring your pipeline.

## Viewing the deployed application

In the left-hand main navigation, click `Deployments` to see all the deployed services and jobs in the environment. Click `Data Generator` to select the deployment. This takes you to an extremely useful screen where you can:

1. View the status of the deployment (such as CPU, memory usage, and replicas assigned).
2. See the live logs for the service.
3. See the topic lineage for the service.
4. Access the build logs (in case of errors when the service is built).
5. Access the `Messages` tab, where you can see messages associated with the service in real time.

## Viewing the application code

There are many ways to view the code for the application (which is then deployed as a job or service). The quickest way from the current screen is to click the area shown:

![Go to code view](./images/data-generator-deployment-code-view.png)

You'll now be in the code view with the **version of the deployed code** displayed.

Review the code, and you'll see that data is generated for each printer, and each printer has its own stream for generated data:

``` python
tasks = []
printer_data = generate_data()

# Distribute all printers over the data length
delay_seconds = int(os.environ['datalength']) / replay_speed / number_of_printers

for i in range(number_of_printers):
    # Set stream ID or leave parameters empty to get stream ID generated.
    name = f"Printer {i + 1}"  # We don't want a Printer 0, so start at 1

    # Start sending data, each printer will start with some delay after the previous one
    tasks.append(asyncio.create_task(generate_data_and_close_stream_async(topic_producer, name, printer_data.copy(), delay_seconds * i)))

await asyncio.gather(*tasks)
```

Feel free to explore the code further.

## 🏃‍♀️ Next step

[Part 3 - Downsampling service :material-arrow-right-circle:{ align=right }](./downsampling.md)
Lines changed: 53 additions & 0 deletions

# Downsampling

This service reduces the sampling rate of data from one sample per second to one per minute.

![Downsampling pipeline segment](./images/downsampling-pipeline-segment.png)

The service uses a buffer to hold data for one minute before releasing it:

``` python
# buffer 1 minute of data
buffer_configuration = qx.TimeseriesBufferConfiguration()
buffer_configuration.time_span_in_milliseconds = 1 * 60 * 1000
```

During the buffering, the data is aggregated in the dataframe handler:

``` python
def on_dataframe_received_handler(originating_stream: qx.StreamConsumer, df: pd.DataFrame):
    if originating_stream.properties.name is not None and stream_producer.properties.name is None:
        stream_producer.properties.name = originating_stream.properties.name + "-down-sampled"

    # Identify numeric and string columns
    numeric_columns = [col for col in df.columns if not col.startswith('TAG__') and
                       col not in ['time', 'timestamp', 'original_timestamp', 'date_time']]
    string_columns = [col for col in df.columns if col.startswith('TAG__')]

    # Create an aggregation dictionary for numeric columns
    numeric_aggregation = {col: 'mean' for col in numeric_columns}

    # Create an aggregation dictionary for string columns (keeping the last value)
    string_aggregation = {col: 'last' for col in string_columns}

    # Merge the two aggregation dictionaries
    aggregation_dict = {**numeric_aggregation, **string_aggregation}

    df["timestamp"] = pd.to_datetime(df["timestamp"])

    # Resample and aggregate the input data
    df = df.set_index("timestamp").resample('1min').agg(aggregation_dict).reset_index()

    # Send the downsampled data to the output topic
    stream_producer.timeseries.buffer.publish(df)
```
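Here's a minimal, standalone illustration of the resample-and-aggregate pattern used in the handler, with made-up sample data rather than the service's real input:

``` python
import pandas as pd

df = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2023-11-29 17:05:00", "2023-11-29 17:05:30", "2023-11-29 17:06:10",
    ]),
    "bed_temperature": [106.0, 108.0, 110.0],
    "TAG__printer": ["Printer 1", "Printer 1", "Printer 1"],
})

# Mean for numeric columns, last value for TAG__ columns, as in the handler
aggregation_dict = {"bed_temperature": "mean", "TAG__printer": "last"}
out = df.set_index("timestamp").resample("1min").agg(aggregation_dict).reset_index()
print(out)
# Two rows: the 17:05 bucket averages to 107.0, the 17:06 bucket holds 110.0
```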
You can read more about using buffers in the [buffer documentation](https://quix.io/docs/quix-streams/v0-5-stable/subscribe.html#using-a-buffer).

The aggregated data is published to the output stream (one stream for each printer).

The output topic for the service is `downsampled-3d-printer-data`. Other services, such as the Forecast service and the InfluxDB raw data storage service, subscribe to this topic.

## 🏃‍♀️ Next step

[Part 4 - Forecast service :material-arrow-right-circle:{ align=right }](./forecast-service.md)
Lines changed: 68 additions & 0 deletions

# Forecast service

Generates a forecast for the temperature data received from the input topic. This is used to predict a potential failure condition, when the ambient temperature of the 3D printer drops below the minimum threshold for successful ABS-based printing.

![Forecast pipeline segment](./images/forecast-pipeline-segment.png)

The forecast is made from the downsampled data, using the scikit-learn library. The forecasts are published to the `forecast` topic. The Alert service and Printers dashboard service both subscribe to this topic.

## Data format

The forecast data format is:

```json
{
  "Epoch": 0,
  "Timestamps": [
    1701284880000000000,
    1701284940000000000,
    1701285000000000000,
    ...
    1701313620000000000
  ],
  "NumericValues": {
    "forecast_fluctuated_ambient_temperature": [
      42.35418149532191,
      42.43955555085827,
      42.52524883234062,
      ...
      119.79365961797913
    ]
  },
  "StringValues": {},
  "BinaryValues": {},
  "TagValues": {
    "printer": [
      "Printer 19-down-sampled",
      "Printer 19-down-sampled",
      "Printer 19-down-sampled",
      ...
      "Printer 19-down-sampled"
    ]
  }
}
```

## Prediction algorithm

The work of the prediction is carried out by the `scikit-learn` library, using second-order (quadratic) polynomial regression:

``` python
forecast_input = df[parameter_name]

# Define the degree of the polynomial regression model
degree = 2
# Create a polynomial regression model
model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
# Fit the model to the data
model.fit(np.array(range(len(forecast_input))).reshape(-1, 1), forecast_input)
# Forecast the future values
forecast_array = np.array(range(len(forecast_input), len(forecast_input) + forecast_length)).reshape(-1, 1)
forecast_values = model.predict(forecast_array)
# Create a DataFrame for the forecast
fcast = pd.DataFrame(forecast_values, columns=[forecast_label])
```
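The excerpt above relies on surrounding service code (`df`, `forecast_length`, `forecast_label`, and so on). Here's a self-contained sketch of the same technique using synthetic input data in place of the service's dataframe, so it can be run in isolation:

``` python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Synthetic stand-in for the downsampled input: a slow cooling trend
forecast_input = pd.Series([45.0 - 0.1 * t for t in range(60)])
forecast_length = 10

# Second-order polynomial regression, as in the service
model = make_pipeline(PolynomialFeatures(2), LinearRegression())
model.fit(np.arange(len(forecast_input)).reshape(-1, 1), forecast_input)

# Predict the next forecast_length points beyond the observed data
forecast_array = np.arange(len(forecast_input), len(forecast_input) + forecast_length).reshape(-1, 1)
forecast_values = model.predict(forecast_array)

fcast = pd.DataFrame(forecast_values, columns=["forecast_fluctuated_ambient_temperature"])
print(fcast.iloc[0, 0])  # close to 39.0, continuing the linear trend
```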
## 🏃‍♀️ Next step

[Part 5 - Alert service :material-arrow-right-circle:{ align=right }](./alert-service.md)
