# Pushing data to Bytebeam
## Introduction

The previous section showed us how to connect to Bytebeam. It is now time to start pushing some data. All communication with Bytebeam happens using JSON. On Bytebeam, data is organised into Streams, and each stream is mapped to an MQTT topic. Go through the Creating a Stream guide to create a stream. Every project in Bytebeam comes with a default stream called Device Shadow, and we will illustrate how to push data using this stream.

## MQTT topic and payload structure

To push data to the device shadow, publish data to the MQTT topic:

```
/tenants/{tenant_id}/devices/{device_id}/events/device_shadow/jsonarray
```

Replace `{tenant_id}` with your project name and `{device_id}` with the device ID. You can find both in the JSON file downloaded in the previous section. To push data to a different stream, replace `device_shadow` in the path above with the name of that stream.

The payload should look like this:

```json
[
  {
    "sequence": 1,
    "timestamp": 1710760059006,
    "status": "on"
  }
]
```

Each packet consists of an array of data points. Each data point consists of `sequence`, `timestamp`, and all the other fields that the stream has. `sequence` is an auto-incrementing number for each packet, and `timestamp` is a Unix timestamp in milliseconds. If you have created a custom stream, replace `status` with the fields from that stream. Field names are case-sensitive.

## Trying it out with MQTTX

You can try publishing data using MQTTX as shown below. Notice the QoS 1 setting. You should see the status change on Bytebeam now.

## Implementing data push in your client

Follow these steps to implement data push:

1. Figure out how to serialize data using JSON. json.org has a list of libraries you can use.
2. Programmatically construct the JSON structure as discussed above. Ensure the `timestamp` and `sequence` fields are present, and that `timestamp` is in milliseconds.
3. Publish the data to the topic shown above.
4. Optionally, implement batching in the payload. The server accepts a batch of data points as a JSON array. If your client is expected to generate a large amount of data, it is more efficient to batch data points and send them together rather than sending a single packet with each message.
5. Optionally, implement compression. If you are sending larger batches of data, you can save on data size by sending zlib- or LZ4-compressed payloads to `/tenants/{tenant_id}/devices/{device_id}/events/device_shadow/jsonarray/zlibdeflate` or `/tenants/{tenant_id}/devices/{device_id}/events/device_shadow/jsonarray/lz4` respectively. Notice the `zlibdeflate` and `lz4` at the end of the topics.
6. Optionally, implement persistence. If your device is likely to face frequent network disconnections, it is useful to batch data in memory or on disk while the network is down, and then publish it once the connection is re-established.
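The topic and payload construction above can be sketched in Python. This is a minimal illustration, not an official client: the tenant ID, device ID, and broker address are placeholders for the values in your downloaded device JSON file, and paho-mqtt is just one common choice of MQTT library.

```python
import json
import time
from itertools import count

# Hypothetical identifiers -- substitute the values from your device JSON file.
TENANT_ID = "demo-project"
DEVICE_ID = "1234"

_sequence = count(start=1)  # auto-incrementing sequence number per packet


def stream_topic(stream: str) -> str:
    """Build the publish topic for a given stream."""
    return f"/tenants/{TENANT_ID}/devices/{DEVICE_ID}/events/{stream}/jsonarray"


def make_payload(fields: dict) -> str:
    """Wrap one data point in the JSON-array packet format."""
    point = {
        "sequence": next(_sequence),
        "timestamp": int(time.time() * 1000),  # Unix time in milliseconds
        **fields,  # the stream's own fields, e.g. "status" for device_shadow
    }
    return json.dumps([point])


def publish_shadow(status: str) -> None:
    """Publish a device_shadow data point at QoS 1 (sketch, not run here)."""
    import paho.mqtt.client as mqtt  # assumed available; any MQTT client works

    client = mqtt.Client()
    client.connect("broker.example.com", 1883)  # placeholder broker address
    client.publish(stream_topic("device_shadow"),
                   make_payload({"status": "on" if status else "off"}),
                   qos=1)
    client.disconnect()
```

For a custom stream, pass that stream's fields to `make_payload` and its name to `stream_topic`; the packet shape stays the same.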
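Batching (step 4) amounts to collecting data points and serializing them as one JSON array. A minimal sketch, with the batch size purely illustrative:

```python
import json


class Batcher:
    """Accumulate data points and flush them as one JSON-array packet."""

    def __init__(self, max_points: int = 10):
        self.max_points = max_points
        self.points: list[dict] = []

    def add(self, point: dict) -> str | None:
        """Add a data point; return a serialized batch when full, else None."""
        self.points.append(point)
        if len(self.points) >= self.max_points:
            return self.flush()
        return None

    def flush(self) -> str | None:
        """Serialize and clear any pending points."""
        if not self.points:
            return None
        payload = json.dumps(self.points)
        self.points = []
        return payload
```

Publish the returned string to the same `/jsonarray` topic as a single message; the server treats each array element as one data point.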
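For compression (step 5), Python's standard-library `zlib` produces the deflate-format bytes expected on the `/zlibdeflate` topic variant. A sketch:

```python
import json
import zlib


def compress_batch(points: list[dict]) -> bytes:
    """Serialize a batch of data points and compress it with zlib (deflate)."""
    return zlib.compress(json.dumps(points).encode("utf-8"))


# The compressed bytes are published (still at QoS 1) to the stream topic
# with the /zlibdeflate suffix, e.g.:
#   /tenants/{tenant_id}/devices/{device_id}/events/device_shadow/jsonarray/zlibdeflate
# For LZ4, compress with an LZ4 library instead and use the /lz4 suffix.
```

Compression pays off on larger batches; for a single small data point the zlib header overhead can outweigh the savings.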
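Persistence (step 6) can be as simple as appending each batch to a file while offline and replaying the file on reconnect. A minimal sketch, assuming one JSON batch per line; a production client would also handle partial writes and file rotation:

```python
from collections import deque
from pathlib import Path


class DiskBackedQueue:
    """Queue serialized batches while offline; replay them on reconnect.

    Each batch is appended as one line to a file, so queued data also
    survives a process restart.
    """

    def __init__(self, path: str):
        self.path = Path(path)
        self.pending: deque[str] = deque()
        if self.path.exists():
            self.pending.extend(self.path.read_text().splitlines())

    def push(self, batch_json: str) -> None:
        """Record a batch that could not be sent."""
        self.pending.append(batch_json)
        with self.path.open("a") as f:
            f.write(batch_json + "\n")

    def drain(self, publish) -> None:
        """Call publish(batch) for every queued batch, oldest first."""
        while self.pending:
            publish(self.pending.popleft())
        self.path.write_text("")  # all batches delivered; clear the backlog
```

On reconnect, call `drain` with your MQTT publish function so queued data goes out in its original order, preserving sequence numbers.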