
Reducing Memory Footprint When Using UIImage

Recently, my team and I have been working on an image processing app for a client. The app works well in most respects. Unfortunately, once we started loading HD images on screen, the app would occasionally crash with an out-of-memory exception.

In this article, I would like to share the problem we faced, the root cause of the high memory usage, and how we managed to reduce the app’s memory footprint with a simple fix recommended by Apple engineers.


The Problem

As mentioned earlier, our app’s memory usage spikes when we load HD images on screen. To showcase the problem, I have created a sample app that loads an HD image into an image view when a button is tapped.

@IBAction func loadImage(_ sender: Any) {
    imageView.image = UIImage(named: "lady.jpg")
}
Load an HD image into an image view

Note that I am using an image with dimensions of 3648px × 5472px and a file size of 2.4MB.

Here’s the memory report we get from Xcode:

Memory footprint when loading an HD image on screen

As you can see, the memory usage spikes up from 8MB to 87MB after the image is loaded.

This doesn’t make sense at all! We are loading a 2.4MB image file, so how does the memory footprint get so high? Where is all this memory usage coming from? 😫


Memory Usage ≠ File Size

One approach we took to tackle this problem was to scale down the image and draw it on screen based on the image view’s frame size. Unfortunately, that didn’t fix our problem.

Without any clue about how to fix the problem, I headed over to the web. After a few Google searches, I ended up with an amazing WWDC video that introduced a very important concept that none of us knew about:

Memory use is related to the dimensions of the image, not the file size.

Session 416 (iOS Memory Deep Dive), WWDC 2018

Yes I know, the video is from 2018, but it’s never too late to learn, right? 😉

According to the video, in order to show an image on screen, iOS first needs to decode and decompress it. Typically, one pixel of a decoded image takes up 4 bytes of memory: 1 byte for red, 1 byte for green, 1 byte for blue, and 1 byte for the alpha component.

Take our sample image with dimensions 3648px × 5472px as an example:

(3648 * 5472) * 4 bytes ≈ 80MB 

which adds up to what we saw in the memory report.
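This back-of-the-envelope calculation can be expressed directly in code. A quick sketch (the 4-bytes-per-pixel figure assumes the common 8-bit RGBA format; other pixel formats use different amounts):

```swift
// Estimate the decoded (in-memory) size of an image, assuming
// 4 bytes per pixel (8-bit RGBA, the typical format on iOS).
func estimatedDecodedSize(widthInPixels: Int, heightInPixels: Int) -> Int {
    return widthInPixels * heightInPixels * 4
}

let bytes = estimatedDecodedSize(widthInPixels: 3648, heightInPixels: 5472)
let megabytes = Double(bytes) / 1_000_000
// ≈ 79.8MB, roughly the 80MB spike seen in the memory report
```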


Downsampling Comes to the Rescue

Cool! We have figured out where the memory usage comes from, but how can we reduce the memory footprint to an acceptable level?

As I mentioned earlier, we tried to scale down and redraw the image, but that didn’t help. This is because resizing a UIImage is expensive: during the resizing process, iOS still decodes and decompresses the original image, causing the same unwanted memory spike.
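For reference, the naive redraw approach looked roughly like this (a sketch using UIGraphicsImageRenderer; the function name is illustrative). It produces a smaller image, but iOS must still fully decode the original in order to draw it, so the peak memory cost stays the same:

```swift
import UIKit

// Naive resize: draw the UIImage into a smaller graphics context.
// iOS still has to fully decode the original image before drawing,
// so the peak memory usage is the same as showing it directly.
func naiveResize(_ image: UIImage, to pointSize: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: pointSize)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: pointSize))
    }
}
```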

Downsampling Using ImageIO

Luckily for us, the WWDC video provides a great solution to the problem we were facing: we can use the Image I/O framework to resize an image before showing it on screen. What is so great about this approach is that we only pay the memory cost of the resized image. This technique is called “downsampling”.

The following code snippet demonstrates how to perform downsampling on an image:

func downsample(imageAt imageURL: URL,
                to pointSize: CGSize,
                scale: CGFloat = UIScreen.main.scale) -> UIImage? {

    // Create a CGImageSource that represents the image
    let imageSourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let imageSource = CGImageSourceCreateWithURL(imageURL as CFURL, imageSourceOptions) else {
        return nil
    }
    
    // Calculate the maximum dimension of the downsampled image, in pixels
    let maxDimensionInPixels = max(pointSize.width, pointSize.height) * scale
    
    // Perform downsampling
    let downsampleOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxDimensionInPixels
    ] as CFDictionary
    guard let downsampledImage = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, downsampleOptions) else {
        return nil
    }
    
    // Return the downsampled image as UIImage
    return UIImage(cgImage: downsampledImage)
}

Let us first go through the function’s parameters:

  • imageURL: The image URL. It can be a web URL or a local image path.
  • pointSize: The desired size of the downsampled image. Usually, this will be the UIImageView‘s frame size.
  • scale: The downsampling scale factor. Usually, this will be the scale factor associated with the screen (we usually refer to it as @2x or @3x). That’s why you can see that its default value has been set to UIScreen.main.scale.

The Option Flags

Next, I would like to draw your attention to three of the option flags used in the function: kCGImageSourceShouldCache, kCGImageSourceShouldCacheImmediately, and kCGImageSourceCreateThumbnailWithTransform.

First, let’s talk about kCGImageSourceShouldCache. When this flag is set to false, we tell the Image I/O framework that we only want to create a reference to the image source, without decoding the image at the moment the CGImageSource object is created.

Pro tip:

In the situation where you do not have access to the path of the image source, you can create a CGImageSource object using the CGImageSourceCreateWithData() initializer.
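Building on that pro tip, here is what a data-based variant might look like (a sketch mirroring the downsample(imageAt:to:scale:) function above; the Data parameter is assumed to hold a complete encoded image, for example from a network response):

```swift
import ImageIO
import UIKit

// Variant of the downsampling function that works on in-memory image data
// instead of a file URL.
func downsample(imageData: Data,
                to pointSize: CGSize,
                scale: CGFloat = UIScreen.main.scale) -> UIImage? {

    // Create a CGImageSource from raw data without decoding it yet
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let imageSource = CGImageSourceCreateWithData(imageData as CFData, sourceOptions) else {
        return nil
    }

    // Same downsampling options as the URL-based version
    let maxDimensionInPixels = max(pointSize.width, pointSize.height) * scale
    let downsampleOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxDimensionInPixels
    ] as CFDictionary
    guard let downsampledImage = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, downsampleOptions) else {
        return nil
    }

    return UIImage(cgImage: downsampledImage)
}
```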

Next on the list is kCGImageSourceShouldCacheImmediately. This flag tells the Image I/O framework to decode the image at the exact moment the downsampling process starts.

Therefore, by using both the kCGImageSourceShouldCache and kCGImageSourceShouldCacheImmediately option flags, we have full control over when we take the CPU hit for image decoding.
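Since we control when decoding happens, we can also move that CPU work off the main thread. A sketch reusing the downsample(imageAt:to:scale:) function above (the queue label and helper name are illustrative):

```swift
import UIKit

// A serial background queue for CPU-heavy image work (label is illustrative)
let imageProcessingQueue = DispatchQueue(label: "com.example.image-downsampling")

// Downsample off the main thread, then hand the result back to the UI.
func setDownsampledImage(from imageURL: URL, on imageView: UIImageView) {
    let targetSize = imageView.bounds.size
    imageProcessingQueue.async {
        // Decoding happens here, on the background queue,
        // thanks to kCGImageSourceShouldCacheImmediately
        let image = downsample(imageAt: imageURL, to: targetSize)
        DispatchQueue.main.async {
            imageView.image = image
        }
    }
}
```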

Lastly, there is the kCGImageSourceCreateThumbnailWithTransform option flag. Setting this flag to true is very important, as it tells the Image I/O framework to apply the original image’s orientation to the downsampled image.

Let’s See The Result

Now let’s load our sample image on screen one more time, this time with downsampling enabled.

let filePath = Bundle.main.url(forResource: "lady", withExtension: "jpg")!
let downsampledLadyImage = downsample(imageAt: filePath, to: imageView.bounds.size)
imageView.image = downsampledLadyImage

Here’s the memory report:

Memory footprint after downsampling the image

By downsampling the image, we managed to reduce the memory footprint from 87MB to 11MB! Pretty impressive, isn’t it?

But that’s not all!

Let’s take a look at a side-by-side comparison of the original image and the downsampled image. You can barely notice any difference between them.

Comparing the downsampled image with the original

The Caveat

You have seen the result of downsampling, and it is pretty amazing. But that doesn’t mean you should start bundling HD images in your app and downsampling them whenever needed.

Keep in mind that downsampling is a process that takes up CPU power. Therefore, it is still preferable to ship a properly scaled image rather than downsample an HD one.

In other words, you should consider downsampling only when you need to display an image whose dimensions are much larger than the required on-screen dimensions.
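One way to apply this rule is to read the image’s pixel dimensions cheaply, without decoding it, before deciding whether downsampling is worth the CPU cost. A sketch (the function name and the 2× threshold are arbitrary choices, not part of any Apple recommendation):

```swift
import ImageIO
import UIKit

// Read the encoded image's pixel dimensions without decoding it,
// and only downsample when they greatly exceed the on-screen size.
func shouldDownsample(imageAt imageURL: URL,
                      for pointSize: CGSize,
                      scale: CGFloat = UIScreen.main.scale) -> Bool {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let imageSource = CGImageSourceCreateWithURL(imageURL as CFURL, sourceOptions),
          let properties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, nil) as? [CFString: Any],
          let pixelWidth = properties[kCGImagePropertyPixelWidth] as? CGFloat,
          let pixelHeight = properties[kCGImagePropertyPixelHeight] as? CGFloat else {
        return false
    }

    let neededMaxDimension = max(pointSize.width, pointSize.height) * scale
    // Arbitrary 2x threshold: only pay the downsampling cost when the
    // source is substantially larger than what will be shown on screen.
    return max(pixelWidth, pixelHeight) > neededMaxDimension * 2
}
```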


Wrapping Up

There you have it! This is how we managed to cut almost 85% of our app’s memory footprint.

Our app now uses less memory, is more responsive, and consumes less power. Other apps running on the same device have more memory to work with, and most importantly, our users are now happy users.


If you enjoyed reading this article, feel free to check out my other iOS development articles. You can also follow me on Twitter and subscribe to my monthly newsletter.

Thanks for reading. 👨🏻‍💻


👋🏻 Hey!

While you’re still here, why not check out some of my favorite Mac tools on Setapp? They will definitely help improve your day-to-day productivity. Additionally, doing so will also help support my work.

  • Bartender: Superpower your menu bar and take full control over your menu bar items.
  • CleanShot X: The best screen capture app I’ve ever used.
  • PixelSnap: Measure on-screen elements with ease and precision.
  • iStat Menus: Track CPU, GPU, sensors, and more, all in one convenient tool.