1.2.3. Edge Node

This node can send data serialized as an Inference Data Type and receives the resulting inference as an Inference Solution Data Type.
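The data sent by an Edge Node is built from an array of bytes, so application data must be serialized before it is handed to the node. The following is a minimal Python sketch of that step; the import path is an assumption about the Python bindings, and only the InferenceDataType constructor and the to_string() conversion also appear in the examples of this section.

# Assumed import path for the Python bindings; adjust it to your installation.
from amlip_py.types.InferenceDataType import InferenceDataType

# Serialize application data (here a plain string) into the byte array
# expected by the InferenceDataType constructor.
raw_bytes = 'Some data to be inferred'.encode('utf-8')
data = InferenceDataType(raw_bytes)

# The inference received back (an Inference Solution Data Type) can be
# read as a string with to_string(), as the examples below do.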

1.2.3.1. Synchronous

This node kind requires active interaction with the user to perform its action. Once data has been sent, the calling thread must wait for the inference to arrive before sending more data. Users can call the request_inference method to send new data; the thread calling this method waits until the whole process has finished and the inference has arrived from the Inference Node in charge of the data. When the node is destroyed, every internal entity is correctly destroyed.

1.2.3.1.1. Steps

  • Instantiate the Edge Node by creating an object of this class with a name.

  • Create a new InferenceDataType from an array of bytes.

  • Send data synchronously and wait for the inference by calling request_inference.

// Create a new Edge Node
auto node = eprosima::amlip::EdgeNode("My_Edge_Node");

// Create new data to be executed remotely
auto data = eprosima::amlip::InferenceDataType("Some data as byte array serialized from a string or bytes");

// Send data to a remote Inference Node and wait for the inference
// This could also be called with an id, and it will return the id of the server that sent the inference
auto solution = node.request_inference(data);
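The same synchronous flow can be sketched in Python. This is a minimal sketch assuming the Python bindings expose an EdgeNode class with an equivalent request_inference method and the import paths shown; it is not taken from this section, and the return value is assumed to mirror the C++ call above.

# Assumed import paths for the Python bindings; adjust them to your installation.
from amlip_py.node.EdgeNode import EdgeNode
from amlip_py.types.InferenceDataType import InferenceDataType

# Create a new Edge Node
node = EdgeNode('My_Edge_Node')

# Create new data to be executed remotely
data = InferenceDataType('Some data as byte array serialized from a string or bytes')

# Send data to a remote Inference Node and block until the inference arrives.
# The return value is assumed to mirror the C++ call above.
solution = node.request_inference(data)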

1.2.3.2. Asynchronous

Users can call the request_inference method to send new data. Since the node is asynchronous, multiple requests can be sent without waiting for the previous one to finish. The solution is delivered back to the user through the listener. When the node is destroyed, every internal entity is correctly destroyed.

1.2.3.2.1. Steps

  • Instantiate the Asynchronous Edge Node by creating an object of this class with a name, a listener or callback, and a domain.

  • Create a new InferenceDataType from an array of bytes.

  • Send data asynchronously by calling request_inference.

  • Wait for the inference.

# Assumed import paths for the amlip_py Python bindings; adjust them to your installation.
from amlip_py.node.AsyncEdgeNode import AsyncEdgeNode, InferenceListenerLambda
from amlip_py.types.InferenceDataType import InferenceDataType

import threading

# DDS Domain Id in which the node runs (example value).
DOMAIN_ID = 0

# Standard-library event used to block main() until the inference has arrived.
inference_arrived = threading.Event()

# Inference listener callback.
# It is called with each inference received from an Inference Node
# for data previously sent by this node.
def inference_received(
        inference,
        task_id,
        server_id):
    print(f'Data received from server: {server_id}\n'
          f' with id: {task_id}\n'
          f' inference: {inference.to_string()}')
    # Unblock the main thread now that the solution has arrived.
    inference_arrived.set()

def main():
    # Create a new Async Edge Node
    node = AsyncEdgeNode(
        "My_Async_Edge_Node",
        listener=InferenceListenerLambda(inference_received),
        domain=DOMAIN_ID)

    # Create new data to be executed remotely
    data = InferenceDataType("Some data as byte array serialized from a string or bytes")

    # Send data to a remote Inference Node.
    # The inference will arrive asynchronously through the listener.
    task_id = node.request_inference(data)

    # The user must wait until the solution has been received.
    # If the node goes out of scope, it is destroyed
    # and the solution will never arrive.
    inference_arrived.wait()


if __name__ == '__main__':
    main()
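As the steps above mention, the node accepts "a listener or callback"; the InferenceListenerLambda wrapper used here is only one option. Below is a minimal sketch of a class-based listener; the InferenceListener base class name, its import path, and the inference_received method signature are assumptions mirroring the callback above, not confirmed by this section.

# Hypothetical class-based listener; InferenceListener and the method
# signature are assumed to mirror the callback used above.
from amlip_py.node.AsyncEdgeNode import InferenceListener  # assumed import path

class MyInferenceListener(InferenceListener):

    def inference_received(self, inference, task_id, server_id):
        # Handle the solution here instead of in a free function.
        print(f'Inference from {server_id} (task {task_id}): {inference.to_string()}')

# node = AsyncEdgeNode('My_Async_Edge_Node', listener=MyInferenceListener(), domain=DOMAIN_ID)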