1.4. Distributed Inference Scenario

This scenario covers the distribution of large amounts of data to remote nodes that perform inference on it, parallelizing the work and avoiding blocking other important tasks that may need to run on the same device. It uses MultiService over DDS to publish and distribute the data efficiently across the remote nodes, optimizing overall performance and resource utilization.

The data to be inferred is sent from an Edge Node to an Inference Node, which performs the inference and sends the result back to the Edge Node.

[Figure: Distributed Inference Scenario (distributed_inference_scenario.png)]

1.4.1. Inference Data Type

The Inference Data Type represents the data on which inference is to be performed. Internally, the data sent from an Edge Node to an Inference Node is treated as a byte array of arbitrary size. So far, this class can be interacted with from a void*, a byte array, or a string. From the Python API, it is handled via the str and bytes types.
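Because the Inference Data Type is treated as a raw byte array, any serialization the user chooses can be applied before handing the data over. A minimal sketch of preparing such a payload in Python (the JSON encoding and field names here are illustrative assumptions, not part of the AML-IP API):

```python
import json

# Any Python object can be serialized to the byte array the
# Inference Data Type expects; JSON is just one possible choice.
sample = {"image_id": 42, "pixels": [0, 255, 128]}
payload = json.dumps(sample).encode("utf-8")  # bytes sent to the Inference Node

# The Inference Node applies the same convention in reverse.
received = json.loads(payload.decode("utf-8"))
assert received == sample
```

The same bytes object (or the equivalent str) is what would be passed when constructing the inference data from the Python API.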

1.4.2. Inference Solution Data Type

The Inference Solution Data Type represents the result of the inference performed on the data sent by the Edge Node. The Inference Solution sent from an Inference Node back to an Edge Node is treated as a byte array of arbitrary size. So far, this class can be interacted with from a void*, a byte array, or a string. From the Python API, it is handled via the str and bytes types.

Note

There is no fixed data format here; the content of the byte array is whatever the user wants it to be.
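Since the format is entirely user-defined, an Inference Node could, for example, return its result as a plain string encoded to bytes. A sketch (the result string and its fields are illustrative assumptions):

```python
# The solution content is user-defined: a plain string works as well
# as any binary format, since only bytes travel on the wire.
result = "label=cat;confidence=0.93"
solution_bytes = result.encode("utf-8")  # what travels as the Inference Solution

# The Edge Node decodes it back using the same agreed convention.
decoded = solution_bytes.decode("utf-8")
fields = dict(item.split("=") for item in decoded.split(";"))
assert fields["label"] == "cat"
```

Both sides simply need to agree on the convention; the data types themselves impose none.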