Physical Computing

One of the characteristics of a Web3 solution is that it is autonomous: data is not entered by humans but is captured and sent to the system by sensors and microcontrollers (known as Physical Computing solutions). This ensures that the data is authentic, untampered with, and not subject to human error.

The data captured and sent will be stored in the Smart Contract on the private Blockchain and in a NoSQL database.

Microcontroller Circuits

There will be three microcontroller circuits, as listed below, each placed at a different location with sensors to capture and transmit the required data from that location.

Microcontroller Type | Participant/Location   | Sensors (Parameters)
Loading              | Seller/Loading Dock    | ESP32 CAM (Number Plate Image), Ultrasonic (Departure Time)
Vehicle              | Transporter/On Vehicle | Load/Weight (Loaded Quantity), Temperature (Temperature), Humidity (Humidity)
Unloading            | Buyer/Unloading Dock   | ESP32 CAM (Number Plate Image), Ultrasonic (Arrival Time), Load/Weight (Unloaded Quantity)

Data Format

Microcontrollers will publish their data to the IoT Platform (an MQTT Broker or an API Gateway) using the MQTT protocol or APIs, in the format below. This is also the format expected by the processing function and the format in which the data is stored in the OperationalData table in the database. Each microcontroller, and each sensor attached to it, will have a unique ID.

{
  "microcontroller_id": 2,
  "timestamp": 173456789, // In seconds
  "sensor_data": [
    {
      "sensor_id": 21,
      "parameters": [
        {
          "name": "weight",
          "value": 99
        }
      ]
    }
  ]
}
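
As an illustration, the sketch below shows how a microcontroller could build and publish a payload in this format. It is a minimal sketch, assuming the Arduino framework for ESP32 with the PubSubClient and ArduinoJson (v6) libraries; the WiFi credentials, broker address, topic name and sensor values are placeholders, and a real build would take the timestamp from NTP or an RTC rather than millis().

#include <WiFi.h>
#include <PubSubClient.h>
#include <ArduinoJson.h>

// Placeholder credentials and addresses -- not part of the actual solution config.
const char* WIFI_SSID   = "demo-ssid";
const char* WIFI_PASS   = "demo-pass";
const char* MQTT_BROKER = "broker.example.local";
const char* MQTT_TOPIC  = "operational-data";    // assumed topic name

WiFiClient wifiClient;
PubSubClient mqtt(wifiClient);

void setup() {
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(500);
  mqtt.setServer(MQTT_BROKER, 1883);
  mqtt.connect("microcontroller-2");             // unique client id per microcontroller
}

// Serialise one weight reading into the agreed JSON structure and publish it.
void publishWeight(int sensorId, float weight) {
  StaticJsonDocument<256> doc;
  doc["microcontroller_id"] = 2;
  doc["timestamp"] = (uint32_t)(millis() / 1000); // seconds since boot; use NTP/RTC time in practice
  JsonArray sensors = doc.createNestedArray("sensor_data");
  JsonObject sensor = sensors.createNestedObject();
  sensor["sensor_id"] = sensorId;
  JsonArray params = sensor.createNestedArray("parameters");
  JsonObject p = params.createNestedObject();
  p["name"]  = "weight";
  p["value"] = weight;

  char payload[256];
  serializeJson(doc, payload);
  mqtt.publish(MQTT_TOPIC, payload);
}

void loop() {
  mqtt.loop();                                   // keep the MQTT connection alive
  publishWeight(21, 99.0);                       // sample values matching the example above
  delay(10000);
}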

For the purposes of the demo, all three microcontrollers will have WiFi capability. In a real-world deployment the Vehicle microcontroller will need alternative mobile data connectivity such as GPRS.

Triggers and Sequence of Data Transmission

The triggers for capturing data, and the sequence in which the microcontrollers will publish it, are as follows:

1. Loading ANPR
The Ultrasonic sensor on the Loading microcontroller will detect the arrival of the vehicle when it comes within a defined proximity/distance. This will trigger the Camera to take a picture and upload it to the ANPR API for recognition. If the ANPR API returns a failure (due to an unclear picture), the camera will take another picture and resend it until the API returns a success response. The green LED will turn on when the ANPR response is a success.
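
A simplified sketch of this trigger is shown below, assuming an HC-SR04 ultrasonic sensor, the esp32-camera driver and the ESP32 HTTPClient library; the pin numbers, proximity threshold and ANPR endpoint URL are hypothetical, and the WiFi connection plus board-specific camera initialisation are reduced to a comment in setup().

#include <WiFi.h>
#include <HTTPClient.h>
#include "esp_camera.h"

const int   TRIG_PIN = 12, ECHO_PIN = 13, GREEN_LED = 2;      // assumed wiring
const float ARRIVAL_CM = 20.0;                                // assumed proximity threshold
const char* ANPR_URL = "http://anpr.example.local/recognise"; // hypothetical endpoint

// HC-SR04 distance measurement in centimetres.
float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long echoUs = pulseIn(ECHO_PIN, HIGH, 30000);               // 30 ms timeout
  return echoUs * 0.0343f / 2.0f;
}

// Take one picture and POST it to the ANPR API.
// Success is assumed to be reported as HTTP 200 here.
bool captureAndRecognise() {
  camera_fb_t* fb = esp_camera_fb_get();                      // camera assumed initialised in setup()
  if (!fb) return false;
  HTTPClient http;
  http.begin(ANPR_URL);
  http.addHeader("Content-Type", "image/jpeg");
  int status = http.POST(fb->buf, fb->len);
  http.end();
  esp_camera_fb_return(fb);
  return status == 200;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(GREEN_LED, OUTPUT);
  // WiFi connection and board-specific esp_camera_init() configuration go here.
}

void loop() {
  if (readDistanceCm() < ARRIVAL_CM) {
    while (!captureAndRecognise()) delay(1000);               // retry until the ANPR API succeeds
    digitalWrite(GREEN_LED, HIGH);                            // success indicator
  }
  delay(500);
}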

2. Loaded Quantity
The Load/Weight sensor on the On Vehicle microcontroller will publish the Loaded Quantity value as soon as it detects some weight. This assumes the entire load will be placed in one go, which may not exactly mimic a real-world process but will be sufficient for the demo. A more appropriate trigger may be determined and implemented in the next version of the solution.
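
One way this trigger could look with a load cell on an HX711 amplifier (using the common HX711 Arduino library) is sketched below; the pins, calibration factor and weight threshold are assumptions, and publishing is left as a call into the MQTT helper shown under Data Format.

#include "HX711.h"

const int   HX711_DOUT = 16, HX711_SCK = 4;   // assumed wiring
const float CAL_FACTOR   = 420.0f;            // placeholder calibration factor
const float THRESHOLD_KG = 1.0f;              // "some weight" means readings above this

HX711 scale;
bool  loadedQuantitySent = false;

void setup() {
  scale.begin(HX711_DOUT, HX711_SCK);
  scale.set_scale(CAL_FACTOR);
  scale.tare();                               // zero the empty load platform
}

void loop() {
  float kg = scale.get_units(10);             // average of 10 readings
  if (!loadedQuantitySent && kg > THRESHOLD_KG) {
    // publishWeight(21, kg);                 // publish via the helper from the Data Format sketch
    loadedQuantitySent = true;                // single-shot trigger, per the one-go assumption above
  }
  delay(1000);
}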

3. Transport Conditions
The Temperature and Humidity sensors on the On Vehicle microcontroller will start publishing the Transport Conditions data at a predefined frequency as soon as the Loaded Quantity has been sent.
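
The periodic loop could look like the sketch below, assuming a DHT22 sensor on the Adafruit DHT library and a 60-second reporting interval; the pin number and interval are placeholders, and the loadedQuantitySent flag is assumed to be set by the Loaded Quantity trigger above.

#include "DHT.h"

const int           DHT_PIN     = 14;         // assumed wiring
const unsigned long INTERVAL_MS = 60000;      // assumed reporting frequency

DHT dht(DHT_PIN, DHT22);
unsigned long lastReport = 0;
bool loadedQuantitySent  = false;             // set once the Loaded Quantity has been published

void setup() {
  dht.begin();
}

void loop() {
  if (loadedQuantitySent && millis() - lastReport >= INTERVAL_MS) {
    float t = dht.readTemperature();          // degrees Celsius
    float h = dht.readHumidity();             // % relative humidity
    if (!isnan(t) && !isnan(h)) {
      // publish t and h in the JSON format shown under Data Format
    }
    lastReport = millis();
  }
}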

4. Departure Time
When the Ultrasonic sensor on the Loading microcontroller detects the vehicle, the microcontroller will store a flag that the vehicle has arrived. When the vehicle moves out of the defined proximity/distance, the microcontroller will publish the current time as the Departure Time.
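
The arrival flag and departure trigger amount to a small state machine; a fragment is sketched below under the same proximity-threshold assumption as the Loading ANPR sketch, and it reuses the readDistanceCm() helper from that sketch. The inverse transition (outside-to-inside) is what publishes the Arrival Time on the Unloading microcontroller in the next step.

const float ARRIVAL_CM = 20.0;          // assumed proximity threshold
bool vehiclePresent = false;            // flag: vehicle has arrived and is still at the dock

void loop() {
  float cm = readDistanceCm();          // HC-SR04 helper from the Loading ANPR sketch
  if (!vehiclePresent && cm < ARRIVAL_CM) {
    vehiclePresent = true;              // arrival detected; remember it
  } else if (vehiclePresent && cm >= ARRIVAL_CM) {
    vehiclePresent = false;
    // publish the current time as the Departure Time, in the JSON format shown above
  }
  delay(500);
}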

5. Arrival Time
The Ultrasonic sensor on the Unloading microcontroller will detect the arrival of the vehicle when it comes within a defined proximity/distance, and the microcontroller will publish the current time as the Arrival Time.

6. Unloading ANPR
The detection of the arriving vehicle will also trigger the Camera to take a picture and upload it to the ANPR API for recognition. If the ANPR API returns a failure (due to an unclear picture), the camera will take another picture and resend it until the API returns a success response. The green LED will turn on when the ANPR response is a success.

7. Unloaded Quantity
The Load/Weight sensor on the Unloading microcontroller will publish the Unloaded Quantity value as soon as it detects some weight. This assumes the entire load will be placed in one go, which may not exactly mimic a real-world process but will be sufficient for the demo. A more appropriate trigger may be determined and implemented in the next version of the solution.