CIImage saturation using CIFilters, is it possible?

I am working with a CIImage object and trying to get its saturation value. But I cannot find a way to extract image saturation, at least using the CIFilters available in OS X. Is it possible to get the saturation of a CIImage using standard CIFilters?

Take a look at the CIColorControls filter. With it you can adjust an image's saturation, brightness, and contrast.

To adjust saturation, this filter linearly interpolates between a grayscale image (saturation = 0.0) and the original image (saturation = 1.0). The filter supports extrapolation: for values larger than 1.0, it increases saturation.


CIFilter *colorControlsFilter = [CIFilter filterWithName:@"CIColorControls"];
[colorControlsFilter setDefaults];
[colorControlsFilter setValue:sourceImage forKey:@"inputImage"];
[colorControlsFilter setValue:@1.0f forKey:@"inputSaturation"]; // 0.0 = grayscale, 1.0 = unchanged, > 1.0 = boosted
[colorControlsFilter setValue:@0.2f forKey:@"inputBrightness"];
[colorControlsFilter setValue:@1.0f forKey:@"inputContrast"];

CIImage *outputImage = [colorControlsFilter valueForKey:@"outputImage"];

See Apple's CIColorControls documentation, and the full list of built-in filters in the Core Image Filter Reference.


You cannot extract saturation as an image using the built-in CIFilters, but you can code it quite simply. I won't give you a full implementation, but here is the CIKernel that does the work:

vec4 rgbToHsv(vec4 rgb) {
    float x = min(rgb.r, min(rgb.g, rgb.b));
    float v = max(rgb.r, max(rgb.g, rgb.b));
    float f = (rgb.r == x) ? rgb.g - rgb.b : ((rgb.g == x) ? rgb.b - rgb.r : rgb.r - rgb.g);
    float i = (rgb.r == x) ? 3.0 : ((rgb.g == x) ? 5.0 : 1.0);
    float h = i - (f / (v - x));
    float s = (v - x) / v;
    return (v == x) ? vec4(-1.0, 0.0, v, rgb.a) : vec4(h, s, v, rgb.a);
}

kernel vec4 saturationToGrayscale(sampler image) {
    vec4 color = sample(image, samplerCoord(image));
    vec4 hsv = rgbToHsv(color);
    return premultiply(vec4(vec3(hsv.g), color.a));
}

The rgbToHsv function comes from Andy Finnell's blog.

The output is a grayscale image where white means sat == 1.0 and black means sat == 0.0. You can invert this by using "1.0 - hsv.g" instead of "hsv.g" in the return statement. You can also retrieve hue and value by using hsv.r and hsv.b respectively.

Making it a CIFilter is left to you. Take a look at Apple's Creating Custom Filters guide to do this. The code above goes in the .cikernel file.


You can adjust the saturation of an RGB image by converting it to YpCbCr and scaling the Cb and Cr values. Accelerate's vImage can do this conversion for you, and Apple has a sample code project discussing that very topic here.

You can wrap vImage operations in a CIImageProcessorKernel to make them available to Core Image. Again, Apple has an article explaining how to do that here.

Hope that helps,



  • Yes, Justin, this way one can change or set the desired saturation. But I need to extract the saturation from a CIImage as a standalone grayscale image.