class Aws::CleanRoomsML::Types::GetTrainedModelInferenceJobResponse


@!attribute [rw] create_time
  The time at which the trained model inference job was created.
  @return [Time]
@!attribute [rw] update_time
  The most recent time at which the trained model inference job was updated.
  @return [Time]
@!attribute [rw] trained_model_inference_job_arn
  The Amazon Resource Name (ARN) of the trained model inference job.
  @return [String]
@!attribute [rw] configured_model_algorithm_association_arn
  The Amazon Resource Name (ARN) of the configured model algorithm association that was used for the trained model inference job.
  @return [String]
@!attribute [rw] name
  The name of the trained model inference job.
  @return [String]
@!attribute [rw] status
  The status of the trained model inference job.
  @return [String]
@!attribute [rw] trained_model_arn
  The Amazon Resource Name (ARN) for the trained model that was used for the trained model inference job.
  @return [String]
@!attribute [rw] trained_model_version_identifier
  The version identifier of the trained model used for this inference job. This identifies the specific version of the trained model that was used to generate the inference results.
  @return [String]
@!attribute [rw] resource_config
  The resource configuration information for the trained model inference job.
  @return [Types::InferenceResourceConfig]
@!attribute [rw] output_configuration
  The output configuration information for the trained model inference job.
  @return [Types::InferenceOutputConfiguration]
@!attribute [rw] membership_identifier
  The membership ID of the membership that contains the trained model inference job.
  @return [String]
@!attribute [rw] data_source
  The data source that was used for the trained model inference job.
  @return [Types::ModelInferenceDataSource]
@!attribute [rw] container_execution_parameters
  The execution parameters for the model inference job container.
  @return [Types::InferenceContainerExecutionParameters]
@!attribute [rw] status_details
  Details about the status of a resource.
  @return [Types::StatusDetails]
@!attribute [rw] description
  The description of the trained model inference job.
  @return [String]
@!attribute [rw] inference_container_image_digest
  Information about the inference container image.
  @return [String]
@!attribute [rw] environment
  The environment variables to set in the Docker container.
  @return [Hash<String,String>]
@!attribute [rw] kms_key_arn
  The Amazon Resource Name (ARN) of the KMS key. This key is used to encrypt and decrypt customer-owned data in the ML inference job and associated data.
  @return [String]
@!attribute [rw] metrics_status
  The metrics status for the trained model inference job.
  @return [String]
@!attribute [rw] metrics_status_details
  Details about the metrics status for the trained model inference job.
  @return [String]
@!attribute [rw] logs_status
  The logs status for the trained model inference job.
  @return [String]
@!attribute [rw] logs_status_details
  Details about the logs status for the trained model inference job.
  @return [String]
@!attribute [rw] tags
  The optional metadata that you applied to the resource to help you categorize and organize them. Each tag consists of a key and an optional value, both of which you define.

  The following basic restrictions apply to tags:

  * Maximum number of tags per resource - 50.

  * For each resource, each tag key must be unique, and each tag key can have only one value.

  * Maximum key length - 128 Unicode characters in UTF-8.

  * Maximum value length - 256 Unicode characters in UTF-8.

  * If your tagging schema is used across multiple services and resources, remember that other services may have restrictions on allowed characters. Generally allowed characters are: letters, numbers, and spaces representable in UTF-8, and the following characters: + - = . _ : / @.

  * Tag keys and values are case sensitive.

  * Do not use aws:, AWS:, or any upper or lowercase combination of these as a prefix for keys, as it is reserved for AWS use. You cannot edit or delete tag keys with this prefix. Values can have this prefix. If a tag value has aws as its prefix but the key does not, then Clean Rooms ML considers it to be a user tag and it will count against the limit of 50 tags. Tags with only the key prefix of aws do not count against your tags per resource limit.
  @return [Hash<String,String>]
@see docs.aws.amazon.com/goto/WebAPI/cleanroomsml-2023-09-06/GetTrainedModelInferenceJobResponse AWS API Documentation