100 Days of Swift – Day 82

Reflections on Day 81

Yesterday I didn’t post any article as I simply followed the Swift on Sundays stream, where we built an ARKit app to track pictures of scientists and show information about them.

It was a great stream and a very enjoyable experience but, for some reason, my version of the app (GitHub here) doesn’t show the text next to the picture, only the flags.

I suspected a bad import of the JSON, but it is not that: that would have given me a fatal error. Then I suspected wrong code, but I copy-pasted Paul’s code again and … nothing … it still doesn’t work…

Downloading his version of the app works, so I really have no idea what this is about.

Once more, the most frustrating thing is asking for help and being completely ignored. My inner self would like to get angry about this but, sincerely, at this point, who cares?

I really feel pedagogy should be revolutionised so that people who get genuinely invested in learning but get stuck for some reason can find help in some way. Again, saying “feel free to ask for help” is very nice but it is just plain marketing, nothing more, and the same goes for “all profits go to charity”. This is a new fashion of earning income without paying any taxes and then giving away only what remains after your expenses are covered. Don’t get me wrong, no one should ever work for free, but just do not sell it like that. I know, 95% of people are dumb enough to fall into that trap, but I just can’t stand it!

Enough talking, let’s move on to today’s program.

Hacking with Swift — learning Project 22

I am pretty far behind: while the initiative is at Day 87, I am only now facing Day 75. But that’s fine, I wanted to take the time needed to understand the topics. Also, after the project on building the Notes app clone, I really feel something has clicked.

Today we are looking at micro-location, the technology that’s able to detect very small distances between things.

First, install the Locate Beacons app on a device other than the one you will be using for testing, and make sure the testing device runs iOS 7 or later.

Second, create a new Single View app called “Detect-a-Beacon”.

Requesting location: Core Location

Inside the Info.plist file we are going to insert the request for location access: add the “Privacy – Location Always and When In Use Usage Description” and the “Privacy – Location When In Use Usage Description” keys, both with a string value of “We want to help you find your nearest store”.

This will cause an alert to pop up when we launch the app, asking for permission to use our location both at any time and while we are using the app.

Now switch to the Main.storyboard file and add a label to the view controller, centred both vertically and horizontally with the appropriate constraints. Set its text to “UNKNOWN” (in capital letters) and its font to System Thin 40.0.

Switch on the Assistant Editor and create an outlet from the label to the view controller class called distanceReading.

Above import UIKit, add an import for the CoreLocation framework. Quoting from Apple’s documentation:

Core Location provides services for determining a device’s geographic location, altitude, orientation, or position relative to a nearby iBeacon. The framework uses all available onboard hardware, including Wi-Fi, GPS, Bluetooth, magnetometer, barometer, and cellular hardware to gather data.

The first time that your app requests authorization, its authorization status is indeterminate and the system prompts the user to grant or deny the request. The system records the user’s response and does not display this panel upon subsequent requests.

After requesting permission and determining whether services are available, you start most services using the CLLocationManager object and receive the results in your associated delegate object.

This Core Location manager object is the next thing we are going to add to the app, in the form of an optional CLLocationManager property called locationManager. This is defined as the object we use to start and stop the delivery of location-related events to our app. Reading this article is advised if you want a deeper understanding of the subject.
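As a minimal sketch, assuming the default ViewController class and the distanceReading outlet created above, the top of the file might look something like this (the CLLocationManagerDelegate conformance is needed for the delegate work described next):

import CoreLocation
import UIKit

class ViewController: UIViewController, CLLocationManagerDelegate {
    @IBOutlet var distanceReading: UILabel!

    // Starts and stops the delivery of location-related events to our app.
    var locationManager: CLLocationManager?
}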

Inside viewDidLoad we instantiate the location manager, set the view controller to be its delegate (remember to conform to CLLocationManagerDelegate) and call the requestAlwaysAuthorization() method on it. The documentation recommends not asking for “always” authorisation unless it is really needed but, for the purposes of this app, it is good enough.

Before moving on, also change the view’s background colour to grey.
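Putting that together, viewDidLoad might look something like this:

override func viewDidLoad() {
    super.viewDidLoad()

    // Create the location manager, become its delegate and ask for permission.
    locationManager = CLLocationManager()
    locationManager?.delegate = self
    locationManager?.requestAlwaysAuthorization()

    view.backgroundColor = .gray
}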

Before the end of our class, let’s implement the didChangeAuthorization method, which tells the delegate (us) that the authorisation status for the application changed.

Inside it we check if the status parameter is equal to .authorizedAlways, the fourth case of the CLAuthorizationStatus enumeration (link here). Then we check if monitoring is available for the CLBeaconRegion class: isMonitoringAvailable(for:) returns a Boolean value indicating whether the device supports region monitoring using the specified class, in our case CLBeaconRegion1. Finally, we also check if ranging is available, which tells us whether the device supports the ranging of Bluetooth beacons.
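A possible implementation of this delegate method, with a placeholder where the scanning call will go later:

func locationManager(_ manager: CLLocationManager, didChangeAuthorization status: CLAuthorizationStatus) {
    // Only proceed if the user granted "Always" authorisation.
    if status == .authorizedAlways {
        // Can this device monitor regions of the CLBeaconRegion class?
        if CLLocationManager.isMonitoringAvailable(for: CLBeaconRegion.self) {
            // Can this device range (estimate the distance to) Bluetooth beacons?
            if CLLocationManager.isRangingAvailable() {
                // do stuff
            }
        }
    }
}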

Hunting the beacon: CLBeaconRegion

To get our app finished we need to implement three new methods.

First, startScanning(), which declares a UUID from a precise string and a CLBeaconRegion from that UUID, with a major parameter of 123, a minor parameter of 456 and an identifier of “MyBeacon”. This code initialises and returns a region object that targets a beacon with the specified proximity ID, while the major and minor values are used to identify beacons: the major value identifies one or more beacons (something like a set of beacons), while the minor one identifies a specific beacon within that set. After that, we ask the location manager to start monitoring for the region we just declared and to start ranging any available beacon inside it, as sketched below.
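Note that the UUID string below is only a placeholder: it must match the proximity UUID your beacon (or the Locate Beacons app) is advertising.

func startScanning() {
    // Placeholder UUID: replace it with the one advertised by your beacon.
    let uuid = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!
    let beaconRegion = CLBeaconRegion(proximityUUID: uuid, major: 123, minor: 456, identifier: "MyBeacon")

    // Start monitoring for the region and ranging beacons inside it.
    locationManager?.startMonitoring(for: beaconRegion)
    locationManager?.startRangingBeacons(in: beaconRegion)
}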

Second, update(distance: CLProximity) performs a one-second animation containing a switch on the distance parameter, which is a case of the CLProximity enumeration. For a .far distance (about 4-5 metres in my experience) we change the background colour to blue and the text to “FAR”. For the .near case (about 2-4 metres) the colour will be orange and the text “NEAR”, while for the .immediate case (which means “very near”) the colour will be red and the text “RIGHT HERE”. We also add a default case with a .gray background and an “UNKNOWN” message, as sketched below.
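Something along these lines, assuming the distanceReading outlet from earlier:

func update(distance: CLProximity) {
    UIView.animate(withDuration: 1) {
        switch distance {
        case .far:
            self.view.backgroundColor = .blue
            self.distanceReading.text = "FAR"
        case .near:
            self.view.backgroundColor = .orange
            self.distanceReading.text = "NEAR"
        case .immediate:
            self.view.backgroundColor = .red
            self.distanceReading.text = "RIGHT HERE"
        default:
            self.view.backgroundColor = .gray
            self.distanceReading.text = "UNKNOWN"
        }
    }
}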

Third, we implement the location manager’s didRangeBeacons delegate method, which tells the delegate that one or more beacons are in range. Inside it we check whether there is a beacon by picking the first one from the beacons parameter array and, if there is, call update(distance: beacon.proximity); otherwise we call update(distance:) with the .unknown case.
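A possible implementation:

func locationManager(_ manager: CLLocationManager, didRangeBeacons beacons: [CLBeacon], in region: CLBeaconRegion) {
    if let beacon = beacons.first {
        // At least one beacon is in range: report its proximity.
        update(distance: beacon.proximity)
    } else {
        // No beacons were found in this pass.
        update(distance: .unknown)
    }
}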

To finish, just call the startScanning() method inside the didChangeAuthorization method, in place of the // do stuff comment we wrote earlier.
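That is, the innermost check becomes:

if CLLocationManager.isRangingAvailable() {
    startScanning()
}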

And that’s it. The project is done and it works.

Go to the Locate Beacons app, insert the major and minor numbers into the appropriate beacon (you will recognise it thanks to its UUID) and tap Start Advertising. Then, just have fun!

You can find the code for this project here.


Please don’t forget to drop a hello and a thank you to Paul for all his great work (you can find him on Twitter) and be sure to visit the 100 Days Of Swift initiative page. We are learning so much thanks to him and he deserves to know how grateful we are.

He has written about 20 great books on Swift, all of which you can check out here.

The 100 Days of Swift initiative is based on the Hacking with Swift book, which you should definitely check out.

  1. It is not clear to me why we write CLBeaconRegion.self. What does that mean?

If you like what I’m doing here please consider liking this article and sharing it with some of your peers. If you are feeling really awesome, please consider making a small donation to support my studies and my writing (please appreciate that I am not using advertising on my articles).

If you are interested in my music engraving and my publications, don’t forget to visit my Facebook page and the pages where I publish my scores (Gumroad, SheetMusicPlus, ScoreExchange and Apple Books).

You can also support me by buying Paul Hudson’s books from this Affiliate Link.

Anyways, thank you so much for reading!

Till the next one!

Published by Michele Galvagno

Professional Musical Scores Designer and Engraver. Graduated Classical Musician (cello) and Teacher. Tech Enthusiast and Apprentice iOS / macOS Developer.
