class Aws::CleanRoomsML::Types::StartTrainedModelInferenceJobRequest
@!attribute [rw] membership_identifier
  The membership ID of the membership that contains the trained model
  inference job.
  @return [String]

@!attribute [rw] name
  The name of the trained model inference job.
  @return [String]

@!attribute [rw] trained_model_arn
  The Amazon Resource Name (ARN) of the trained model that is used for
  this trained model inference job.
  @return [String]

@!attribute [rw] trained_model_version_identifier
  The version identifier of the trained model to use for inference. This
  specifies which version of the trained model should be used to generate
  predictions on the input data.
  @return [String]

@!attribute [rw] configured_model_algorithm_association_arn
  The Amazon Resource Name (ARN) of the configured model algorithm
  association that is used for this trained model inference job.
  @return [String]

@!attribute [rw] resource_config
  Defines the resource configuration for the trained model inference job.
  @return [Types::InferenceResourceConfig]

@!attribute [rw] output_configuration
  Defines the output configuration information for the trained model
  inference job.
  @return [Types::InferenceOutputConfiguration]

@!attribute [rw] data_source
  Defines the data source that is used for the trained model inference
  job.
  @return [Types::ModelInferenceDataSource]

@!attribute [rw] description
  The description of the trained model inference job.
  @return [String]

@!attribute [rw] container_execution_parameters
  The execution parameters for the container.
  @return [Types::InferenceContainerExecutionParameters]

@!attribute [rw] environment
  The environment variables to set in the Docker container.
  @return [Hash<String,String>]

@!attribute [rw] kms_key_arn
  The Amazon Resource Name (ARN) of the KMS key. This key is used to
  encrypt and decrypt customer-owned data in the ML inference job and
  associated data.
  @return [String]

@!attribute [rw] tags
  The optional metadata that you apply to the resource to help you
  categorize and organize it. Each tag consists of a key and an optional
  value, both of which you define.

  The following basic restrictions apply to tags:

  * Maximum number of tags per resource - 50.

  * For each resource, each tag key must be unique, and each tag key can
    have only one value.

  * Maximum key length - 128 Unicode characters in UTF-8.

  * Maximum value length - 256 Unicode characters in UTF-8.

  * If your tagging schema is used across multiple services and
    resources, remember that other services may have restrictions on
    allowed characters. Generally allowed characters are: letters,
    numbers, and spaces representable in UTF-8, and the following
    characters: + - = . _ : / @.

  * Tag keys and values are case sensitive.

  * Do not use aws:, AWS:, or any upper or lowercase combination of
    such, as a prefix for keys; it is reserved for AWS use. You cannot
    edit or delete tag keys with this prefix. Values can have this
    prefix. If a tag value has aws as its prefix but the key does not,
    then Clean Rooms ML considers it to be a user tag and it will count
    against the limit of 50 tags. Tags with only the key prefix of aws do
    not count against your tags per resource limit.
  @return [Hash<String,String>]

@see docs.aws.amazon.com/goto/WebAPI/cleanroomsml-2023-09-06/StartTrainedModelInferenceJobRequest AWS API Documentation
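As a rough, non-authoritative sketch, a request shaped like this type might be sent through Aws::CleanRoomsML::Client#start_trained_model_inference_job as shown below. The top-level keys mirror the attributes documented above; the nested member names (instance_type, instance_count, accept, members, account_id, ml_input_channel_arn) and the response member are assumptions about the related Types and should be verified against the SDK reference. All ARNs, identifiers, and values are placeholders, and the optional trained_model_version_identifier and container_execution_parameters fields are omitted.

    # Hypothetical sketch: nested member names and the response member are
    # assumptions; verify against the aws-sdk-cleanroomsml reference.
    require "aws-sdk-cleanroomsml"

    client = Aws::CleanRoomsML::Client.new(region: "us-east-1")

    resp = client.start_trained_model_inference_job(
      membership_identifier: "00000000-0000-0000-0000-000000000000", # membership containing the job
      name: "nightly-scoring-job",
      trained_model_arn: "arn:aws:cleanrooms-ml:us-east-1:111122223333:trained-model/example",
      resource_config: {                   # Types::InferenceResourceConfig (assumed members)
        instance_type: "ml.m5.xlarge",
        instance_count: 1
      },
      output_configuration: {              # Types::InferenceOutputConfiguration (assumed members)
        accept: "text/csv",
        members: [{ account_id: "111122223333" }]
      },
      data_source: {                       # Types::ModelInferenceDataSource (assumed member)
        ml_input_channel_arn: "arn:aws:cleanrooms-ml:us-east-1:111122223333:ml-input-channel/example"
      },
      description: "Nightly scoring run",
      environment: { "LOG_LEVEL" => "info" },   # variables set in the Docker container
      kms_key_arn: "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab",
      tags: { "project" => "clean-rooms-demo" } # user-defined tags; avoid the aws: key prefix
    )

    resp.trained_model_inference_job_arn # response member name assumed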