Package com.viam.service.mlmodel.v1
Class MLModelServiceGrpc.MLModelServiceStub
java.lang.Object
  io.grpc.stub.AbstractStub<S>
    io.grpc.stub.AbstractAsyncStub<MLModelServiceGrpc.MLModelServiceStub>
      com.viam.service.mlmodel.v1.MLModelServiceGrpc.MLModelServiceStub
Enclosing class: MLModelServiceGrpc
public static final class MLModelServiceGrpc.MLModelServiceStub
extends io.grpc.stub.AbstractAsyncStub<MLModelServiceGrpc.MLModelServiceStub>
A stub that allows clients to make asynchronous RPC calls to the MLModelService service.
MLModelService declares the gRPC contract for a service that takes in a map of input arrays/tensors, runs them through an ML inference engine, and outputs a map of arrays/tensors.
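A brief, hedged sketch of how such a stub is typically obtained (not part of the generated documentation): the endpoint, port, plaintext setting, and variable names below are placeholder assumptions, and a real Viam machine connection is normally dialed through the Viam SDK rather than a raw gRPC channel.

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import io.grpc.stub.StreamObserver;
    import com.viam.service.mlmodel.v1.MLModelServiceGrpc;
    import com.viam.service.mlmodel.v1.Mlmodel;

    // Placeholder endpoint; assumes a locally reachable gRPC server.
    ManagedChannel channel = ManagedChannelBuilder
            .forAddress("localhost", 8080)
            .usePlaintext()
            .build();

    // Asynchronous stub: each call returns immediately and delivers its
    // result through a StreamObserver callback.
    MLModelServiceGrpc.MLModelServiceStub asyncStub = MLModelServiceGrpc.newStub(channel);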
Nested Class Summary
Nested classes/interfaces inherited from class io.grpc.stub.AbstractStub
io.grpc.stub.AbstractStub.StubFactory<T extends io.grpc.stub.AbstractStub<T>>
Method Summary
Modifier and Type / Method / Description

protected MLModelServiceGrpc.MLModelServiceStub
  build(io.grpc.Channel channel, io.grpc.CallOptions callOptions)

void
  infer(Mlmodel.InferRequest request, io.grpc.stub.StreamObserver<Mlmodel.InferResponse> responseObserver)
  Infer takes an already ordered input tensor as a map, makes an inference on the model, and returns an output data map.

void
  metadata(Mlmodel.MetadataRequest request, io.grpc.stub.StreamObserver<Mlmodel.MetadataResponse> responseObserver)
  Metadata returns the metadata associated with the ML model.

Methods inherited from class io.grpc.stub.AbstractAsyncStub
newStub, newStub
Methods inherited from class io.grpc.stub.AbstractStub
getCallOptions, getChannel, withCallCredentials, withChannel, withCompression, withDeadline, withDeadlineAfter, withExecutor, withInterceptors, withMaxInboundMessageSize, withMaxOutboundMessageSize, withOnReadyThreshold, withOption, withWaitForReady
Method Details
build
protected MLModelServiceGrpc.MLModelServiceStub build(io.grpc.Channel channel, io.grpc.CallOptions callOptions)
Specified by: build in class io.grpc.stub.AbstractStub<MLModelServiceGrpc.MLModelServiceStub>
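A hedged illustration (not from the generated docs): build is rarely called directly; AbstractStub invokes it internally whenever a with... method derives a reconfigured stub, for example:

    // Deriving a stub with a 5-second deadline; withDeadlineAfter calls
    // build(channel, callOptions) internally to produce the new stub.
    MLModelServiceGrpc.MLModelServiceStub deadlineStub =
            asyncStub.withDeadlineAfter(5, java.util.concurrent.TimeUnit.SECONDS);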
infer
public void infer(Mlmodel.InferRequest request, io.grpc.stub.StreamObserver<Mlmodel.InferResponse> responseObserver)
Infer takes an already ordered input tensor as a map, makes an inference on the model, and returns an output data map.
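A minimal asynchronous usage sketch, assuming the asyncStub from the earlier example and a default-built request; the InferRequest field setters for the model name and input tensors depend on the mlmodel.proto version in use and are omitted here.

    // Hypothetical empty request; populate the model name and input
    // tensors as required by your proto version before sending.
    Mlmodel.InferRequest inferRequest = Mlmodel.InferRequest.newBuilder().build();

    asyncStub.infer(inferRequest, new StreamObserver<Mlmodel.InferResponse>() {
        @Override
        public void onNext(Mlmodel.InferResponse response) {
            // The output tensor map arrives here once inference finishes.
            System.out.println("infer response: " + response);
        }

        @Override
        public void onError(Throwable t) {
            System.err.println("infer failed: " + t);
        }

        @Override
        public void onCompleted() {
            // Unary RPC: called once, after the single response.
        }
    });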
metadata
public void metadata(Mlmodel.MetadataRequest request, io.grpc.stub.StreamObserver<Mlmodel.MetadataResponse> responseObserver)
Metadata returns the metadata associated with the ML model.
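A similar hedged sketch for retrieving model metadata asynchronously, again reusing the asyncStub from above; the response is printed as-is because its exact fields (such as input and output tensor info) depend on the proto version.

    asyncStub.metadata(Mlmodel.MetadataRequest.newBuilder().build(),
            new StreamObserver<Mlmodel.MetadataResponse>() {
                @Override
                public void onNext(Mlmodel.MetadataResponse response) {
                    // Describes the model's expected input and output tensors.
                    System.out.println("model metadata: " + response);
                }

                @Override
                public void onError(Throwable t) {
                    System.err.println("metadata failed: " + t);
                }

                @Override
                public void onCompleted() {}
            });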