I have extended a SurfaceView to display the camera feed in a very simple camera application. To find the optimal preview size for each device, I used this sample code, which appears in almost all the open source camera apps I have seen:
// width is the target (view) width; screenWidth/screenHeight are fields
// that end up holding the chosen preview size.
List<Camera.Size> sizes = parameters.getSupportedPreviewSizes();
double minDiff = Double.MAX_VALUE;
for (Camera.Size size : sizes) {
    if (Math.abs(size.width - width) < minDiff) {
        screenWidth = size.width;
        screenHeight = size.height;
        minDiff = Math.abs(size.width - width);
    }
}
Everything works perfectly up to this point.
Now, due to the nature of the application, I have to keep two bitmaps in memory for the course of a session, and for the sake of simplicity (to avoid memory issues during testing) I used the same code for the PICTURE SIZE, replacing getSupportedPreviewSizes() with getSupportedPictureSizes(). This works great on most devices, though I will eventually have to find some other way to choose the optimum picture size for each device (one such approach is sketched below).
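For reference, a common variant I have seen in other camera apps filters picture sizes by the preview's aspect ratio first and only then picks the closest width. A rough sketch, where choosePictureSize, targetRatio, and width are my own placeholder names rather than anything from the code above:

// Prefer picture sizes matching the preview aspect ratio, then closest width.
private Camera.Size choosePictureSize(Camera.Parameters parameters,
                                      double targetRatio, int width) {
    final double ASPECT_TOLERANCE = 0.05;
    Camera.Size best = null;
    double minDiff = Double.MAX_VALUE;
    // First pass: only consider sizes whose aspect ratio matches the preview.
    for (Camera.Size size : parameters.getSupportedPictureSizes()) {
        double ratio = (double) size.width / size.height;
        if (Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE) continue;
        if (Math.abs(size.width - width) < minDiff) {
            best = size;
            minDiff = Math.abs(size.width - width);
        }
    }
    // Second pass: if nothing matched the ratio, fall back to closest width.
    if (best == null) {
        for (Camera.Size size : parameters.getSupportedPictureSizes()) {
            if (Math.abs(size.width - width) < minDiff) {
                best = size;
                minDiff = Math.abs(size.width - width);
            }
        }
    }
    return best;
}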
Recently, while testing on a Nexus 4, the loop above failed to choose a working picture size. Upon investigation, I found that getSupportedPictureSizes() returns a size, 1280x960, which is not actually supported by the Nexus 4 camera. So how does one solve this issue? Isn't this function supposed to return ONLY those sizes that the device's camera supports? I am sure there will be other devices with the same issue which I won't be able to test on. Any clues as to how this should be resolved?
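The only workaround I can think of is to not trust the advertised list blindly: set each candidate size, then read the parameters back to see whether the driver actually kept it, and fall back to the next candidate otherwise. A sketch, assuming camera is an open android.hardware.Camera and candidates is a preference-ordered list of sizes:

Camera.Size chosen = null;
for (Camera.Size candidate : candidates) {
    Camera.Parameters params = camera.getParameters();
    params.setPictureSize(candidate.width, candidate.height);
    try {
        camera.setParameters(params); // may throw if the driver rejects it
    } catch (RuntimeException e) {
        continue; // driver refused this size outright, try the next one
    }
    // Read back: some drivers silently replace unsupported values.
    Camera.Size actual = camera.getParameters().getPictureSize();
    if (actual.width == candidate.width && actual.height == candidate.height) {
        chosen = candidate;
        break;
    }
}

Per my update below, the Nexus 4 seems to accept the size without complaint, so this may not catch every case, but it at least filters out drivers that reject or silently rewrite the value.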
UPDATE:
What's happening is that the camera accepts the wrong parameter without any error and the image it returns is distorted; I will try to attach a picture here as well. There are no runtime exceptions either.
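To confirm the distortion, I can at least compare the dimensions the camera actually produced against what was requested, without decoding the full JPEG. Something like this sketch (the log tag and callback wiring are mine; it needs android.graphics.BitmapFactory and android.util.Log):

// Header-only decode: reads the JPEG dimensions without allocating a bitmap.
Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        BitmapFactory.Options opts = new BitmapFactory.Options();
        opts.inJustDecodeBounds = true;
        BitmapFactory.decodeByteArray(data, 0, data.length, opts);
        Camera.Size requested = camera.getParameters().getPictureSize();
        Log.d("CameraTest", "requested " + requested.width + "x" + requested.height
                + ", got " + opts.outWidth + "x" + opts.outHeight);
    }
};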