ModelManager
============

Added in [API level 35](https://developer.android.com/guide/topics/manifest/uses-sdk-element.html#ApiLevels)

Kotlin | [Java](/reference/android/adservices/ondevicepersonalization/ModelManager "View this page in Java")

```
open class ModelManager
```

|---|--------------------------------------------------------------|
| [kotlin.Any](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-any/index.html) ||
| ↳ | [android.adservices.ondevicepersonalization.ModelManager](#) |

Handles model inference. Currently, only TFLite model inference is supported. See [android.adservices.ondevicepersonalization.IsolatedService#getModelManager](/reference/kotlin/android/adservices/ondevicepersonalization/IsolatedService#getModelManager(android.adservices.ondevicepersonalization.RequestToken)).

Summary
-------

| Public methods ||
|---|---|
| open [Unit](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-unit/index.html) | [run](#run(android.adservices.ondevicepersonalization.InferenceInput,%20java.util.concurrent.Executor,%20android.os.OutcomeReceiver))`(input: `[InferenceInput](/reference/kotlin/android/adservices/ondevicepersonalization/InferenceInput)`, executor: `[Executor](../../../java/util/concurrent/Executor.html#)`, receiver: `[OutcomeReceiver](../../os/OutcomeReceiver.html#)`<`[InferenceOutput](/reference/kotlin/android/adservices/ondevicepersonalization/InferenceOutput)`!, `[Exception](../../../java/lang/Exception.html#)`!>)` Runs a single model inference. |

Public methods
--------------

### run

Added in [API level 35](https://developer.android.com/guide/topics/manifest/uses-sdk-element.html#ApiLevels)

```
open fun run(
    input: InferenceInput,
    executor: Executor,
    receiver: OutcomeReceiver<InferenceOutput!, Exception!>
): Unit
```

Runs a single model inference. Currently, only TFLite model inference is supported.

This method can take several seconds to complete, so call it only from a worker thread.

| Parameters ||
|---|---|
| `input` | [InferenceInput](/reference/kotlin/android/adservices/ondevicepersonalization/InferenceInput): contains all of the information needed for a run of model inference. This value cannot be `null`. |
| `executor` | [Executor](../../../java/util/concurrent/Executor.html#): the `Executor` on which to invoke the callback. Callback and listener events are dispatched through this executor, providing an easy way to control which thread is used. To dispatch events through the main thread of your application, use [Context.getMainExecutor()](../../content/Context.html#getMainExecutor()); otherwise, provide an executor that dispatches to an appropriate thread. This value cannot be `null`. |
| `receiver` | [OutcomeReceiver](../../os/OutcomeReceiver.html#)\<[InferenceOutput](/reference/kotlin/android/adservices/ondevicepersonalization/InferenceOutput)!, [Exception](../../../java/lang/Exception.html#)!\>: receives an [InferenceOutput](/reference/kotlin/android/adservices/ondevicepersonalization/InferenceOutput) containing the model inference result on success, or an [Exception](../../../java/lang/Exception.html#) on failure. This value cannot be `null`. |
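As a rough sketch, a service might wire up `run` along these lines. This is a hedged illustration, not canonical usage: the `runInference` helper, the single-thread executor choice, and the log tag are assumptions for the example; in practice the `ModelManager` instance comes from `IsolatedService#getModelManager` and the `InferenceInput` is built for the specific model being queried.

```kotlin
import android.adservices.ondevicepersonalization.InferenceInput
import android.adservices.ondevicepersonalization.InferenceOutput
import android.adservices.ondevicepersonalization.ModelManager
import android.os.OutcomeReceiver
import android.util.Log
import java.util.concurrent.Executors

// Hypothetical helper: the ModelManager and InferenceInput are assumed
// to have been obtained elsewhere (e.g. via IsolatedService#getModelManager).
fun runInference(modelManager: ModelManager, input: InferenceInput) {
    // run() may take several seconds, so dispatch callbacks off the main thread.
    val executor = Executors.newSingleThreadExecutor()
    modelManager.run(
        input,
        executor,
        object : OutcomeReceiver<InferenceOutput, Exception> {
            override fun onResult(result: InferenceOutput) {
                // result carries the model inference output tensors.
                Log.d("ModelManagerSample", "Inference succeeded")
            }

            override fun onError(error: Exception) {
                Log.e("ModelManagerSample", "Inference failed", error)
            }
        }
    )
}
```

Because callbacks are dispatched through the supplied `Executor`, the choice of executor controls which thread handles the result; `Context.getMainExecutor()` could be substituted if the result must be consumed on the main thread.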
Content and code samples on this page are subject to the licenses described in the Content License. Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.
Last updated 2025-02-10 UTC.