Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share

iphone - How can I map Photoshop's level adjustment to a Core Image filter?

I'm mapping several Photoshop adjustments to CIFilters; the only one I'm having trouble with is this Levels adjustment:

[screenshot of the Levels dialog from a current version of Photoshop]

Which CIFilter (or combination of filters) would let me use the values 16, 1.73, 239 and 39/245 from the first example, or 31, 1.25, 255 and 30/255 from the second? I believe this is a kind of shadow and black/white level adjustment.

Any help appreciated.


1 Answer


By adapting the formula from this page: http://http.developer.nvidia.com/GPUGems/gpugems_ch22.html, I believe you can do this with a combination of CIColorMatrix, CIGammaAdjust, and a second CIColorMatrix.

Let's call the input levels inBlack, inGamma and inWhite respectively, and the output levels outBlack and outWhite. Note that Photoshop values run from 0 to 255 while Core Image colors run from 0 to 1, so you need to divide the Photoshop values (except inGamma!) by 255 before substituting them into the formulas below.

The input mapping is pixel = (inPixel-inBlack)/(inWhite-inBlack), which means your first matrix will be

red   = [1/(inWhite-inBlack) 0 0 0]
green = [0 1/(inWhite-inBlack) 0 0]
blue  = [0 0 1/(inWhite-inBlack) 0]
alpha = [0 0 0 1]
bias  = [-inBlack/(inWhite-inBlack) -inBlack/(inWhite-inBlack) -inBlack/(inWhite-inBlack) 0]
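As a quick sanity check of this first stage (plain Python for illustration, not Core Image, using the first example's black/white points of 16 and 239 normalized to 0..1), the scale and bias above send the input black point to 0 and the input white point to 1:

```python
# Numerical check of the input mapping pixel = (inPixel - inBlack) / (inWhite - inBlack),
# expressed as the scale + bias that the first CIColorMatrix applies per channel.
inBlack = 16 / 255
inWhite = 239 / 255

def input_map(p):
    # scale by 1/(inWhite - inBlack), bias by -inBlack/(inWhite - inBlack)
    return p * (1 / (inWhite - inBlack)) - inBlack / (inWhite - inBlack)

print(input_map(inBlack))  # ~0.0: the input black point maps to 0
print(input_map(inWhite))  # ~1.0: the input white point maps to 1
```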

Then you apply gamma correction using CIGammaAdjust and the inGamma number (in my calculations I had to use the inverse, 1/inGamma, so try that too!).

Finally the output mapping is pixel = gammaCorrectedPixel * (outWhite - outBlack) + outBlack, giving you the final matrix

red   = [(outWhite - outBlack) 0 0 0]
green = [0 (outWhite - outBlack) 0 0]
blue  = [0 0 (outWhite - outBlack) 0]
alpha = [0 0 0 1]
bias  = [outBlack outBlack outBlack 0]

I haven't actually tried this in Core Image, but the calculations work out nicely!
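Here is the whole three-stage pipeline checked numerically (plain Python standing in for the three filters, assuming the 1/inGamma exponent and clamping between stages, which Core Image may or may not do). With the question's first example, 16 / 1.73 / 239 in and 39/245 out, the input black point lands exactly on outBlack and the input white point on outWhite:

```python
# Full levels pipeline: input mapping, gamma, output mapping,
# with the question's first example normalized to 0..1.
inBlack, inGamma, inWhite = 16 / 255, 1.73, 239 / 255
outBlack, outWhite = 39 / 255, 245 / 255

def levels(p):
    p = (p - inBlack) / (inWhite - inBlack)      # first CIColorMatrix
    p = min(max(p, 0.0), 1.0)                    # clamp for this check (an assumption)
    p = p ** (1 / inGamma)                       # CIGammaAdjust, with the inverse exponent
    return p * (outWhite - outBlack) + outBlack  # second CIColorMatrix

print(levels(inBlack))  # ~0.153, i.e. outBlack (39/255)
print(levels(inWhite))  # ~0.961, i.e. outWhite (245/255)
print(levels(0.5))      # a midtone, lifted by the gamma step
```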

