InferenceInput.Builder
public static final class InferenceInput.Builder
extends Object

java.lang.Object
  ↳ android.adservices.ondevicepersonalization.InferenceInput.Builder

A builder for InferenceInput.
Summary

Public constructors | |
---|---|
Builder(InferenceInput.Params params, Object[] inputData, InferenceOutput expectedOutputStructure) | Creates a new Builder. |

Public methods | |
---|---|
InferenceInput | build() Builds the instance. |
InferenceInput.Builder | setBatchSize(int value) The number of input examples. |
InferenceInput.Builder | setExpectedOutputStructure(InferenceOutput value) The empty InferenceOutput representing the expected output structure. |
InferenceInput.Builder | setInputData(Object... value) An array of input data. |
InferenceInput.Builder | setParams(InferenceInput.Params value) The configuration that controls runtime interpreter behavior. |

Inherited methods
Public constructors
Builder
public Builder (InferenceInput.Params params, Object[] inputData, InferenceOutput expectedOutputStructure)
Creates a new Builder.
Parameters | |
---|---|
params | InferenceInput.Params: The configuration that controls runtime interpreter behavior. This value cannot be null. |
inputData | Object: An array of input data. The inputs should be in the same order as the inputs of the model; see setInputData(Object...) for an example with multiple inputs. For TFLite, this field is mapped to the inputs of runForMultipleInputsOutputs: https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/InterpreterApi#parameters_9. This value cannot be null. |
expectedOutputStructure | InferenceOutput: The empty InferenceOutput representing the expected output structure. For TFLite, the inference code will verify whether this expected output structure matches the model's output signature; see setExpectedOutputStructure(InferenceOutput) for an example with string tensors. This value cannot be null. |
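Putting the constructor arguments together, a minimal sketch of building an InferenceInput might look like the following. It assumes a model that takes a string tensor and an int tensor as inputs and produces a single string output tensor of shape [3, 2]; `params` is assumed to be an InferenceInput.Params configured elsewhere, and java.util.HashMap is assumed to be imported.
// Sketch only: `params` is an InferenceInput.Params built elsewhere; the model's
// input and output shapes below are assumptions for illustration.
String[] input0 = {"foo", "bar"};        // string tensor, shape [2]
int[] input1 = new int[]{3, 2, 1};       // int tensor, shape [3]
Object[] inputData = {input0, input1};   // same order as the model's inputs

String[][] output = new String[3][2];    // empty structure matching an output of shape [3, 2]
HashMap<Integer, Object> outputs = new HashMap<>();
outputs.put(0, output);
InferenceOutput expectedOutputStructure =
        new InferenceOutput.Builder().setDataOutputs(outputs).build();

InferenceInput input =
        new InferenceInput.Builder(params, inputData, expectedOutputStructure).build();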
Public methods
build
public InferenceInput build ()
Builds the instance.
Returns | |
---|---|
InferenceInput | This value cannot be null. |
setBatchSize
public InferenceInput.Builder setBatchSize (int value)
The number of input examples. Adopters can set this field to run batched inference. The batch size is 1 by default, and it should match the size of the input data.
Parameters | |
---|---|
value | int |

Returns | |
---|---|
InferenceInput.Builder | This value cannot be null. |
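As a sketch of batched inference, the example below assumes a model that takes one float feature per example, with the batch dimension first; `params` and `expectedOutputStructure` are assumed to be prepared as shown in the constructor example above.
// Sketch: three examples batched together; the batch size must match the input data size.
float[][] features = {{0.1f}, {0.2f}, {0.3f}};   // assumed input tensor shape [3, 1]
InferenceInput batchedInput =
        new InferenceInput.Builder(params, new Object[]{features}, expectedOutputStructure)
                .setBatchSize(3)
                .build();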
setExpectedOutputStructure
public InferenceInput.Builder setExpectedOutputStructure (InferenceOutput value)
The empty InferenceOutput representing the expected output structure. For TFLite, the inference code will verify whether this expected output structure matches the model's output signature.
If a model produces string tensors:
String[][] output = new String[3][2]; // Output tensor shape is [3, 2].
HashMap<Integer, Object> outputs = new HashMap<>();
outputs.put(0, output);
expectedOutputStructure = new InferenceOutput.Builder().setDataOutputs(outputs).build();
Parameters | |
---|---|
value | InferenceOutput: This value cannot be null. |

Returns | |
---|---|
InferenceInput.Builder | This value cannot be null. |
setInputData
public InferenceInput.Builder setInputData (Object... value)
An array of input data. The inputs should be in the same order as the inputs of the model.
For example, if a model takes multiple inputs:
String[] input0 = {"foo", "bar"}; // string tensor shape is [2].
int[] input1 = new int[]{3, 2, 1}; // int tensor shape is [3].
Object[] inputData = {input0, input1, ...};
For TFLite, this field is mapped to the inputs of runForMultipleInputsOutputs:
https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/InterpreterApi#parameters_9
Parameters | |
---|---|
value | Object: This value cannot be null. |

Returns | |
---|---|
InferenceInput.Builder | This value cannot be null. |
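Because setInputData is a varargs method, the same inputs can also be passed directly instead of assembling an Object[]. A brief sketch, reusing `input0`, `input1`, `params`, and `expectedOutputStructure` from the constructor example above:
// Sketch: varargs form of setInputData; arguments must be in the model's input order.
InferenceInput input =
        new InferenceInput.Builder(params, inputData, expectedOutputStructure)
                .setInputData(input0, input1)
                .build();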
setParams
public InferenceInput.Builder setParams (InferenceInput.Params value)
The configuration that controls runtime interpreter behavior.
Parameters | |
---|---|
value | InferenceInput.Params: This value cannot be null. |

Returns | |
---|---|
InferenceInput.Builder | This value cannot be null. |
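A brief usage sketch. The Params construction below assumes the Builder described in the companion InferenceInput.Params reference, which takes a KeyValueStore and a model key; `remoteData` and "modelKey" are placeholders, and `inputData` and `expectedOutputStructure` are assumed to be prepared as shown in the constructor example above.
// Assumption: InferenceInput.Params is built from a KeyValueStore and a model key,
// as described in the InferenceInput.Params reference; `remoteData` and "modelKey"
// are placeholders for illustration.
InferenceInput.Params params =
        new InferenceInput.Params.Builder(remoteData, "modelKey").build();

InferenceInput input =
        new InferenceInput.Builder(params, inputData, expectedOutputStructure)
                .setParams(params)   // sets the configuration used by the runtime interpreter
                .build();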