In this review-challenge day we are encouraged to go and look up the list of Core Image filters and to try some of them out. Let's start from the beginning.
We start from here, in the Apple Documentation description of the Core Image framework. This is just huge! Trying not to get lost is already a good challenge. From here we move to the Core Image Filter Reference, sadly a no-longer-updated document, for a list of all possible filters divided by category (at a quick glance they seem to be around 150 filters, but they could be many more…). Clicking on any of them expands its description, but it doesn't directly say which key one should use, unless the key is the attribute listed there and we are meant to manually cross-reference it with the other documentation article on filter keys (ouff…). Well… no, it is not there either…
It seems that the solution is to read the whole article called Core Image Programming Guide, simply because while I may be able to add an extra filter, there is no apparent way to know which key I should use and when.
Another article, Processing an Image Using Built-in Filters, seems to provide more sensible instructions. Here we learn that:
- A CIFilter represents a single operation, or recipe, for a particular effect.
- To process a CIImage object, pass it through CIFilter objects.
- It is possible to either subclass CIFilter or draw from the existing library of built-in filters.
A very sympathetic note is left to warn us of our impending doom:
The built-in filters are not separate class types with visible properties. You must know their names and input parameters in advance.
Thank you, Apple!
Some of the more common input parameter types have associated keys, such as kCIInputImageKey. If you cannot infer the associated key constant, you can simply use the string literal found in the filter reference.
Again… thank you… in the list there is no way (for me) to associate one with the other…
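One small consolation: a filter can be asked for its own keys at runtime, which at least shows the exact strings the key constants stand for. A quick sketch:
import CoreImage

// Ask a filter which input parameters it supports.
if let sepia = CIFilter(name: "CISepiaTone") {
    print(sepia.inputKeys)    // e.g. ["inputImage", "inputIntensity"]
    print(sepia.attributes)   // the full description of every parameter
}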
In the next line things become clearer: in the filter description page let's select CICategoryColorEffect and then CISepiaTone. Inside the expanded space we can see that the parameters are inputImage (a CIImage object whose display name is "Image") and inputIntensity.
So, when we program our filter we need to create a CIFilter(name: "CISepiaTone"); call its setValue method, passing our image as the first parameter and, as the key, something that sounds like inputImage, which is kCIInputImageKey; then call the setValue method again, this time passing a Double value (between 0.0 and 1.0) as the first parameter and, as the key, something that sounds like inputIntensity, which is going to be kCIInputIntensityKey.
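In code, this is roughly what that recipe looks like; the image name and the 0.8 intensity are just placeholders for whatever we actually use in the app:
import CoreImage
import UIKit

// Placeholder input image; in the real app it comes from the image picker.
if let inputUIImage = UIImage(named: "example"),
   let beginImage = CIImage(image: inputUIImage),
   let sepiaFilter = CIFilter(name: "CISepiaTone") {
    sepiaFilter.setValue(beginImage, forKey: kCIInputImageKey)    // the image to process
    sepiaFilter.setValue(0.8, forKey: kCIInputIntensityKey)       // a Double between 0.0 and 1.0
}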
To summarise: in order to survive this we need to keep the list of keys always at hand; when we want to use a filter, we browse for it and check whether we have already prepared its key, otherwise we prepare it.
This starts to get clearer but I confess my head is burning from all the new information right now. They say it's healthy, so let's move on!
After explaining how to apply two more filters, we get a warning:
To optimise computation, Core Image does not actually render any intermediate CIImage result until you force the CIImage to display its content onscreen, as you might do using UIImageView.
In short, all this power comes at a cost of computing resources so everything needs to be planned carefully.
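To make the cost concrete, here is a minimal, self-contained sketch of the idea: nothing is computed until we explicitly ask a CIContext for output. The function name and structure are mine, not the project's:
import CoreImage
import UIKit

// A standalone sketch: configure the sepia filter, then force the render.
func renderSepia(from input: UIImage, intensity: Double) -> UIImage? {
    guard let beginImage = CIImage(image: input),
          let filter = CIFilter(name: "CISepiaTone") else { return nil }

    filter.setValue(beginImage, forKey: kCIInputImageKey)
    filter.setValue(intensity, forKey: kCIInputIntensityKey)

    // Up to this point no pixels have been processed; createCGImage is
    // what actually makes Core Image run the filter chain.
    let context = CIContext()   // expensive to create, so reuse it in a real app
    guard let output = filter.outputImage,
          let cgImage = context.createCGImage(output, from: output.extent) else { return nil }

    return UIImage(cgImage: cgImage)
}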
We are invited to check out Filter Recipes to accomplish tasks such as applying a Chroma Key effect, selectively focusing on an image, customising image transitions and simulating scratchy analog film.
This part looks amazingly interesting and I have left the page open in Safari to look at it after the challenges (unless they need us to cover it!).
So, the to-do list:
- Study the Core Image programming guide
- Try out a few new filters, at least one per category
- Study the filter recipes and possibly apply them to the current app
Review
Here is what we learned today, among other things:
- We provide input to Core Image filters using a series of keys and values.
- We can create a UIImage from a CGImage.
- Setting no handler closure for a UIAlertAction will cause your alert controller to be dismissed.
- Core Image filters are created using their names.
- When our user has selected an image inside a UIImagePickerController, our didFinishPickingMediaWithInfo method will get called (see the sketch after this list).
- Writing to the user's photo library requires permission.
- Any method used with #selector must be marked @objc.
- Xcode can automatically suggest Auto Layout constraints.
- Core Image runs slowly in the simulator.
- Each Core Image filter has its own set of keys that it supports.
- Setting allowsEditing to true lets users crop the photo they select. This is not entirely true, because one can only zoom into the picture to select a more specific region, but the frame is immutable (as far as I know).
- We can track changes in the value of a UISlider using its valueChanged event.
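Since the picker callback is the hinge of the whole flow, here is a short sketch of that delegate method; imageView is the same outlet used in the save() method below, the rest is an approximation:
// Lives in the view controller, which conforms to
// UIImagePickerControllerDelegate and UINavigationControllerDelegate.
func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    // With allowsEditing == true we read the edited (zoomed/cropped) image.
    guard let image = info[.editedImage] as? UIImage else { return }
    dismiss(animated: true)
    imageView.image = image
}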
Challenges
Challenge 1: try making the Save button show an error if there was no image in the image view.
This was an easy one, just update the save() method to this:
@IBAction func save(_ sender: Any) {
    // No image selected yet: warn the user and bail out.
    guard let image = imageView.image else {
        let noImageAC = UIAlertController(title: "No Image Found!", message: "Please select an image from your Photo Library before continuing!", preferredStyle: .alert)
        noImageAC.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
        present(noImageAC, animated: true)
        return
    }

    UIImageWriteToSavedPhotosAlbum(image, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)
}
On to the next one!
Challenge 2: make the Change Filter button change its title to show the name of the currently selected filter.
This is somewhat harder because it seems that we cannot connect the title of the alert action directly to the sender’s title label’s text. Or can we?
The act of changing the button's title would happen only when we actually press an action inside the alert, that is, when we invoke the handler: setFilter.
My issue now is as follows: the setFilter method has one parameter of type UIAlertAction, which doesn't get considered here when we call it.
I added an extra parameter for the sender and this made the handler ask for an extra UIAlertAction that could not be set to nil… Very strange… I really hope I do not have to write an extra closure for every single alert action, also because it would always be the same…
Just for the sake of testing I tried to write a closure that would weakly capture the sender, have an action parameter and then update the text of the title’s label. Interestingly enough the text got changed but, after about 1/10 of a second, it reverted back to “Change Filter”. Like, wut?! I mean, really interesting! It means that my closure works but something else gets called straight after that!
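For the record, the experiment looked more or less like this (I am rewriting it from memory, so the action title and the exact wiring are approximate):
@IBAction func changeFilter(_ sender: UIButton) {
    let ac = UIAlertController(title: "Choose filter", message: nil, preferredStyle: .actionSheet)

    // The test closure: weakly capture the button and rename it with the chosen action's title.
    ac.addAction(UIAlertAction(title: "CISepiaTone", style: .default) { [weak sender] action in
        sender?.setTitle(action.title, for: .normal)
    })
    ac.addAction(UIAlertAction(title: "Cancel", style: .cancel))
    present(ac, animated: true)
}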
Maybe this happens because I used weak capturing… if the sender gets released after the closure ends, it would be normal for the title to return to what it was beforehand.
Anyway, I will investigate this further, but I received a hint telling me to add a property observer to the currentFilter property after creating an outlet for the button. It works very well, of course. Here is the new currentFilter property:
var currentFilter: CIFilter! {
    didSet {
        // changeFilter is the outlet to the button; strip the "CI" prefix from the filter's name.
        let name = currentFilter.name.replacingOccurrences(of: "CI", with: "")
        changeFilter.setTitle(name, for: .normal)
    }
}
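Just to see the observer in action: the setFilter handler now only has to assign a new filter and the button renames itself. A stripped-down sketch (the real method obviously does more than this):
func setFilter(action: UIAlertAction) {
    guard let actionTitle = action.title else { return }

    // Assigning a new filter fires the didSet above,
    // which updates the Change Filter button's title.
    currentFilter = CIFilter(name: actionTitle)
}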
Challenge 3: experiment with having more than one slider, to control each of the input keys you care about. For example, you might have one for radius and one for intensity.
Very briefly because I need to go back to work.
I found this challenge strangely easy, which probably means that something is very amiss… Sure, I didn’t take the time to fine-tune every single detail, I just wanted to have something that worked, nothing more.
So, I performed all of the necessary changes in the storyboard as quickly as possible, duplicating buttons and sliders, unplugging outlets and replugging them into new properties, and then just changed the first parameter of the setValue call in applyProcessing to use the appropriate slider.
It all works; I would just like each slider to be active only when its filter needs it, with the others disabled (something like the sketch below).
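Here is a minimal sketch of the kind of thing I have in mind, assuming one slider for intensity and one for radius; the outlet names (intensitySlider, radiusSlider) and the radius scaling are my own choices, not necessarily what ends up in the project:
func applyProcessing() {
    let inputKeys = currentFilter.inputKeys

    // Enable only the sliders whose keys the current filter supports…
    intensitySlider.isEnabled = inputKeys.contains(kCIInputIntensityKey)
    radiusSlider.isEnabled = inputKeys.contains(kCIInputRadiusKey)

    // …and feed each slider's value only to the keys that exist.
    if inputKeys.contains(kCIInputIntensityKey) {
        currentFilter.setValue(intensitySlider.value, forKey: kCIInputIntensityKey)
    }
    if inputKeys.contains(kCIInputRadiusKey) {
        currentFilter.setValue(radiusSlider.value * 200, forKey: kCIInputRadiusKey)
    }

    // …then render and display the output as usual.
}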
Wrap up!
I will probably come back to this if time allows.
In the meantime you can find the completed project here.
If you like what I’m doing here please consider liking this article and sharing it with some of your peers. If you are feeling like being really awesome, please consider making a small donation to support my studies and my writing (please appreciate that I am not using advertisement on my articles).
If you are interested in my music engraving and my publications don't forget to visit my Facebook page and the pages where I publish my scores (Gumroad, SheetMusicPlus, ScoreExchange and on Apple Books).
You can also support me by buying Paul Hudson’s books from this Affiliate Link.
Anyways, thank you so much for reading!
Till the next one!