
Label objects in an image

Amplify iOS v1 is now in Maintenance Mode until May 31st, 2024. This means that we will continue to include updates to ensure compatibility with backend services and security. No new features will be introduced in v1.

Please use the latest version (v2) of Amplify Library for Swift to get started.

If you are currently using v1, follow these instructions to upgrade to v2.

Amplify libraries should be used for all new cloud-connected applications. If you are currently using the AWS Mobile SDK for iOS, you can access the documentation here.

The following APIs enable you to identify real-world objects (chairs, desks, etc.) in images. These objects are referred to as "labels".

For labeling images on iOS we use both AWS backend services and Apple's on-device Core ML Vision framework to provide the most accurate results. If your device is offline, we return results from Core ML only. If your device can reach AWS services, we return the union of the results from both the service and Core ML. Switching between backend services and Core ML happens automatically, with no additional configuration required.
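If you want to restrict a request to the on-device Core ML model only, you can override the default network policy in the request options, as shown in the commented lines of the API examples below. Here is a minimal sketch (the function name is illustrative):

import Amplify

// Force offline-only inference by setting the default network policy to .offline.
func detectLabelsOffline(_ image: URL) {
    let options = PredictionsIdentifyRequest.Options(defaultNetworkPolicy: .offline, pluginOptions: nil)
    Amplify.Predictions.identify(type: .detectLabels(.labels), image: image, options: options) { event in
        switch event {
        case let .success(result):
            let data = result as! IdentifyLabelsResult
            print(data.labels) // Core ML results only
        case let .failure(error):
            print(error)
        }
    }
}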

Set up your backend

If you haven't already done so, run amplify init inside your project and then amplify add auth (we recommend selecting the default configuration).

Run amplify add predictions, then use the following answers:

? Please select from one of the categories below (Use arrow keys)
❯ Identify
  Convert
  Interpret
  Infer
  Learn More

? What would you like to identify?
  Identify Text
  Identify Entities
❯ Identify Labels

? Provide a friendly name for your resource
  <Enter a friendly name here>

? Would you like use the default configuration?
❯ Default Configuration
  Advanced Configuration

? Who should have access?
  Auth users only
❯ Auth and Guest users

The Advanced Configuration allows you to choose between detecting only unsafe content (moderation), only labels, or both; the Default Configuration detects both. These choices correspond to the label types passed to the identify API below: .labels returns labels only, while .all also includes unsafe content.

Run amplify push to create the resources in the cloud.
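Before you call the Predictions APIs, Amplify must be configured with the Auth and Predictions plugins during app startup. The following is a minimal sketch of a typical Amplify v1 setup (for example, called from your AppDelegate); any plugin or module name beyond those shown in this page's examples should be checked against your dependency setup:

import Amplify
import AmplifyPlugins

func configureAmplify() {
    do {
        // Auth is required because the Predictions resources allow Auth and Guest users.
        try Amplify.add(plugin: AWSCognitoAuthPlugin())
        try Amplify.add(plugin: AWSPredictionsPlugin())
        // If you also want on-device results, add the Core ML plugin here
        // (for example CoreMLPredictionsPlugin, if it is part of your dependencies).
        try Amplify.configure()
        print("Amplify configured with the Predictions plugin")
    } catch {
        print("Failed to configure Amplify: \(error)")
    }
}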

Working with the API

You can identify real-world objects such as chairs and desks, which are referred to as “labels”, by using the following sample code:

import Amplify

func detectLabels(_ image: URL) {
    // For offline-only calls to the Core ML model, pass this options instance
    // via the `options:` parameter of the `identify` call below:
    // let options = PredictionsIdentifyRequest.Options(defaultNetworkPolicy: .offline, pluginOptions: nil)
    Amplify.Predictions.identify(type: .detectLabels(.labels), image: image) { event in
        switch event {
        case let .success(result):
            let data = result as! IdentifyLabelsResult
            print(data.labels)
            // Use the labels in your app as you like or display them
        case let .failure(error):
            print(error)
        }
    }
}

// To identify labels together with unsafe content
func detectLabelsWithUnsafeContent(_ image: URL) {
    Amplify.Predictions.identify(type: .detectLabels(.all), image: image) { event in
        switch event {
        case let .success(result):
            let data = result as! IdentifyLabelsResult
            print(data.labels)
            // Use the labels in your app as you like or display them
        case let .failure(error):
            print(error)
        }
    }
}
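If you use Combine, the same calls are available through the resultPublisher on the returned operation; the caller is responsible for retaining the returned AnyCancellable: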
import Amplify
import Combine

func detectLabels(_ image: URL) -> AnyCancellable {
    // For offline-only calls to the Core ML model, pass this options instance
    // via the `options:` parameter of the `identify` call below:
    // let options = PredictionsIdentifyRequest.Options(defaultNetworkPolicy: .offline, pluginOptions: nil)
    Amplify.Predictions.identify(type: .detectLabels(.labels), image: image)
        .resultPublisher
        .sink {
            if case let .failure(error) = $0 {
                print(error)
            }
        }
        receiveValue: { result in
            let data = result as! IdentifyLabelsResult
            print(data.labels)
            // Use the labels in your app as you like or display them
        }
}

// To identify labels together with unsafe content
func detectLabelsWithUnsafeContent(_ image: URL) -> AnyCancellable {
    Amplify.Predictions.identify(type: .detectLabels(.all), image: image)
        .resultPublisher
        .sink {
            if case let .failure(error) = $0 {
                print(error)
            }
        }
        receiveValue: { result in
            let data = result as! IdentifyLabelsResult
            print(data.labels)
            // Use the labels in your app as you like or display them
        }
}
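A hypothetical call site for the Combine variant, assuming an image named sample.jpg is bundled with the app (the file name is a placeholder; use any local image URL):

import Combine
import Foundation

var cancellables = Set<AnyCancellable>()

if let imageURL = Bundle.main.url(forResource: "sample", withExtension: "jpg") {
    // Keep the subscription alive until the result or error arrives.
    detectLabels(imageURL)
        .store(in: &cancellables)
}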