@@ -168,9 +168,12 @@ collect data from it.
  --campaign-file campaign-brake-event.json
  ```

- - (Optional) To enable S3 upload, append the option `--data-destination S3`. By default the
-   upload format will be JSON. You can change this to Parquet format for S3 by passing
-   `--s3-format PARQUET`.
+ - The default `--data-destination` is S3, and the default upload format is JSON. You can change
+   this to Parquet format for S3 by passing `--s3-format PARQUET`.
+ - (Optional) To enable Amazon Timestream as the destination, add the flag
+   `--data-destination TIMESTREAM`. **Note**: Amazon Timestream for Live Analytics is only
+   available to customers who have already been onboarded in that region. See
+   [the availability change documentation](https://docs.aws.amazon.com/timestream/latest/developerguide/AmazonTimestreamForLiveAnalytics-availability-change.html).
- (Optional) To enable IoT topic as the destination, add the flag `--data-destination IOT_TOPIC`. To
  define the custom IoT topic, use the flag `--iot-topic <TOPIC_NAME>`. Note: The IoT topic data
  destination is a "gated" feature of AWS IoT FleetWise for which you will need to request
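Putting the options above together, a full run targeting S3 with Parquet output could look like the
following sketch. The script name `demo.sh` and the `--vehicle-name` flag are assumptions based on
the surrounding guide; only the destination and format flags are documented in this hunk:

```bash
# Sketch only: demo.sh and --vehicle-name are assumed names; the destination
# and format flags are the ones documented above.
./demo.sh \
  --vehicle-name fwdemo \
  --campaign-file campaign-brake-event.json \
  --data-destination S3 \
  --s3-format PARQUET
```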
@@ -188,8 +191,8 @@ collect data from it.
  The demo script:

  1. Registers your AWS account with AWS IoT FleetWise, if not already registered.
- 1. Creates an Amazon Timestream database and table.
- 1. Creates IAM role and policy required for the service to write data to Amazon Timestream.
+ 1. Creates an S3 bucket with a bucket policy that allows AWS IoT FleetWise to write data to the
+    bucket.
  1. Creates a signal catalog based on `can-nodes.json`.
  1. Creates a model manifest that references the signal catalog with all of the CAN signals.
  1. Activates the model manifest.
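The bucket policy created in the first step above would grant the AWS IoT FleetWise service
principal permission to write objects. A rough sketch of such a policy, with a placeholder bucket
name (the policy the script actually generates may differ in its exact statements):

```bash
# Illustrative only: the bucket name is a placeholder, and the policy the demo
# script generates may contain additional statements or conditions.
aws s3api put-bucket-policy --bucket my-fleetwise-demo-bucket --policy '{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": {"Service": "iotfleetwise.amazonaws.com"},
    "Action": ["s3:PutObject"],
    "Resource": "arn:aws:s3:::my-fleetwise-demo-bucket/*"
  }]
}'
```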
@@ -203,19 +206,25 @@ collect data from it.
  1. Creates a campaign from `campaign-brake-event.json` that contains a condition-based collection
     scheme to capture the engine torque and the brake pressure when the brake pressure is above
     7000, and targets the campaign at the fleet.
+ 1. The data uploaded to S3 is in JSON format, or in Parquet format if the `--s3-format PARQUET`
+    option is passed.
  1. Approves the campaign.
  1. Waits until the campaign status is `HEALTHY`, which means the campaign has been deployed to
     the fleet.
- 1. Waits 30 seconds and then downloads the collected data from Amazon Timestream.
+ 1. Waits 20 minutes for the data to propagate to S3 and then downloads it.
  1. Saves the data to an HTML file.

- If S3 upload is enabled, the demo script will instead:
+ If `TIMESTREAM` upload is enabled (**Note**: Amazon Timestream for Live Analytics is only
+ available to customers who have already been onboarded in that region. See
+ [the availability change documentation](https://docs.aws.amazon.com/timestream/latest/developerguide/AmazonTimestreamForLiveAnalytics-availability-change.html)),
+ the demo script will instead:

- 1. Create an S3 bucket with a bucket policy that allows AWS IoT FleetWise to write data to the
-    bucket.
- 1. Creates a campaign from `campaign-brake-event.json` to upload the data to S3 in JSON format,
-    or Parquet format if the `--s3-format PARQUET` option is passed.
- 1. Wait 20 minutes for the data to propagate to S3 and then download it.
+ 1. Creates an Amazon Timestream database and table.
+ 1. Creates the IAM role and policy required for the service to write data to Amazon Timestream.
+ 1. Creates a campaign from `campaign-brake-event.json` that contains a condition-based collection
+    scheme to capture the engine torque and the brake pressure when the brake pressure is above
+    7000, and targets the campaign at the fleet.
+ 1. Waits 30 seconds and then downloads the collected data from Amazon Timestream.
  1. Saves the data to an HTML file.

  This script will not delete Amazon Timestream or S3 resources.
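Once the data has propagated, the uploaded objects can also be inspected by hand rather than
waiting for the script's download step. A sketch with a placeholder bucket name (substitute the
bucket the demo script reports):

```bash
# Placeholders: substitute the bucket (and any key prefix) the demo script reports.
aws s3 ls s3://my-fleetwise-demo-bucket/ --recursive
aws s3 cp s3://my-fleetwise-demo-bucket/ ./collected-data/ --recursive
```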
@@ -229,8 +238,11 @@ collect data from it.
  simulated brake pressure signal. As you can see, when hard braking events occur (value above
  7000), collection is triggered and the engine torque signal data is collected.

- Alternatively, if your AWS account is enrolled with Amazon QuickSight or Amazon Managed Grafana,
- you may use them to browse the data from Amazon Timestream directly.
+ Alternatively, if your upload destination was set to `TIMESTREAM` and your AWS account is
+ enrolled with Amazon QuickSight or Amazon Managed Grafana, you may use them to browse the data
+ from Amazon Timestream directly. **Note**: Amazon Timestream for Live Analytics is only
+ available to customers who have already been onboarded in that region. See
+ [the availability change documentation](https://docs.aws.amazon.com/timestream/latest/developerguide/AmazonTimestreamForLiveAnalytics-availability-change.html).

  ![](./images/collected_data_plot.png)

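Besides QuickSight and Grafana, the Timestream records can be inspected directly from the AWS CLI.
A sketch using hypothetical database and table names (substitute the ones the demo script creates):

```bash
# Hypothetical database/table names; replace with those created by the demo.
aws timestream-query query --query-string \
  'SELECT time, measure_name, measure_value::double
   FROM "FleetWise"."VehicleData"
   ORDER BY time DESC LIMIT 10'
```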
@@ -452,9 +464,12 @@ collect data from it.
  --campaign-file campaign-brake-event.json
  ```

- - (Optional) To enable S3 upload, append the option `--data-destination S3`. By default the
-   upload format will be JSON. You can change this to Parquet format by passing
-   `--s3-format PARQUET`.
+ - The default `--data-destination` is S3, and the default upload format is JSON. You can change
+   this to Parquet format for S3 by passing `--s3-format PARQUET`.
+ - (Optional) To enable Amazon Timestream as the destination, add the flag
+   `--data-destination TIMESTREAM`. **Note**: Amazon Timestream for Live Analytics is only
+   available to customers who have already been onboarded in that region. See
+   [the availability change documentation](https://docs.aws.amazon.com/timestream/latest/developerguide/AmazonTimestreamForLiveAnalytics-availability-change.html).
- (Optional) To enable IoT topic as the destination, add the flag `--data-destination IOT_TOPIC`. To
  define the custom IoT topic, use the flag `--iot-topic <TOPIC_NAME>`. Note: The IoT topic data
  destination is a "gated" feature of AWS IoT FleetWise for which you will need to request
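As in the earlier section, selecting Timestream only changes the destination flag. A sketch under
the same assumptions (the script name `demo.sh` and the `--vehicle-name` flag are not shown in this
diff):

```bash
# Sketch only: demo.sh and --vehicle-name are assumed names.
./demo.sh \
  --vehicle-name fwdemo \
  --campaign-file campaign-brake-event.json \
  --data-destination TIMESTREAM
```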
@@ -468,8 +483,8 @@ collect data from it.
  The demo script:

  1. Registers your AWS account with AWS IoT FleetWise, if not already registered.
- 1. Creates an Amazon Timestream database and table.
- 1. Creates IAM role and policy required for the service to write data to Amazon Timestream.
+ 1. Creates an S3 bucket with a bucket policy that allows AWS IoT FleetWise to write data to the
+    bucket.
  1. Creates a signal catalog based on `can-nodes.json`.
  1. Creates a model manifest that references the signal catalog with all of the CAN signals.
  1. Activates the model manifest.
@@ -483,22 +498,27 @@ collect data from it.
  1. Creates a campaign from `campaign-brake-event.json` that contains a condition-based collection
     scheme to capture the engine torque and the brake pressure when the brake pressure is above
     7000, and targets the campaign at the fleet.
+ 1. The data uploaded to S3 is in JSON format, or in Parquet format if the `--s3-format PARQUET`
+    option is passed.
  1. Approves the campaign.
  1. Waits until the campaign status is `HEALTHY`, which means the campaign has been deployed to
     the fleet.
- 1. Waits 30 seconds and then downloads the collected data from Amazon Timestream.
+ 1. Waits 20 minutes for the data to propagate to S3 and then downloads it.
  1. Saves the data to an HTML file.

- If S3 upload is enabled, the demo script will additionally:
+ If `TIMESTREAM` upload is enabled, the demo script will instead:

- 1. Create an S3 bucket with a bucket policy that allows AWS IoT FleetWise to write data to the
-    bucket.
- 1. Creates an additional campaign from `campaign-brake-event.json` to upload the data to S3 in
-    JSON format, or Parquet format if the `--s3-format PARQUET` option is passed.
- 1. Wait 20 minutes for the data to propagate to S3 and then download it.
- 1. Save the data to an HTML file.
+ **Note**: Amazon Timestream for Live Analytics is only available to customers who have already
+ been onboarded in that region. See
+ [the availability change documentation](https://docs.aws.amazon.com/timestream/latest/developerguide/AmazonTimestreamForLiveAnalytics-availability-change.html).

- This script will not delete Amazon Timestream or S3 resources.
+ 1. Creates an Amazon Timestream database and table.
+ 1. Creates the IAM role and policy required for the service to write data to Amazon Timestream.
+ 1. Creates a campaign from `campaign-brake-event.json` that contains a condition-based collection
+    scheme to capture the engine torque and the brake pressure when the brake pressure is above
+    7000, and targets the campaign at the fleet.
+ 1. Waits 30 seconds and then downloads the collected data from Amazon Timestream.
+ 1. Saves the data to an HTML file. Note that this script will not delete Amazon Timestream or S3
+    resources.

1. When the script completes, a path to an HTML file is given. _On your local machine_, use `scp` to
   download it, then open it in your web browser:
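The `scp` step could take the following shape; the user, host, and remote path are placeholders for
the values the script prints:

```bash
# Placeholders: replace the user, host, and the remote path printed by the script.
scp ubuntu@<EC2_HOST>:/path/printed/by/script/collected-data.html .
# Then open it locally:
open collected-data.html   # macOS; use xdg-open on Linux
```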
@@ -511,8 +531,11 @@ collect data from it.
  simulated brake pressure signal. As you can see, when hard braking events occur (value above
  7000), collection is triggered and the engine torque signal data is collected.

- Alternatively, if your AWS account is enrolled with Amazon QuickSight or Amazon Managed Grafana,
- you may use them to browse the data from Amazon Timestream directly.
+ Alternatively, if your upload destination was set to `TIMESTREAM` and your AWS account is
+ enrolled with Amazon QuickSight or Amazon Managed Grafana, you may use them to browse the data
+ from Amazon Timestream directly. **Note**: Amazon Timestream for Live Analytics is only
+ available to customers who have already been onboarded in that region. See
+ [the availability change documentation](https://docs.aws.amazon.com/timestream/latest/developerguide/AmazonTimestreamForLiveAnalytics-availability-change.html).

  ![](./images/collected_data_plot.png)
