Apple devices are particularly well known for their cameras. Apple devices may put a big hole in your pocket, but there are many valid reasons to buy one, and the camera is chief among them.
These days some compact cameras boast as many as 16MP, while you can get more than 24MP on a DSLR. Phone cameras are also packing in the megapixels: the Samsung Galaxy S4 offers a 13MP camera, the Galaxy S5 will offer 16MP, and the Sony Xperia Z2 has a 20.7MP camera. But the camera on Apple's flagship iPhone – the iPhone 5s – offers only 8MP. Does this mean that the iPhone's camera is worse than other phone cameras that offer more megapixels?
It's simply not correct to think that the more megapixels your smartphone has, the better your images will be. Of course, that doesn't stop various phone and camera manufacturers from promoting their products as high-end, superior devices based on the number of megapixels they have packed in. These manufacturers are hoping to fool you into thinking that the pixel count has something to do with image quality. People like to see a single number that says one thing is better than another, so many are taken in by this ruse.
The number of pixels means nothing if the manufacturer has achieved it by cramming them onto a small sensor, as is the case with many modern camera phones. In some cases, in order to achieve more pixels, manufacturers are making those pixels smaller.
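To see why cramming pixels onto a small sensor matters, you can work out the pixel pitch (the side length of one pixel) for a given sensor size and megapixel count. The sketch below uses an illustrative 1/3″ sensor of roughly 4.8 × 3.6 mm – a typical phone-camera size assumed here for the example, not an official spec for any particular phone:

```python
import math

def pixel_pitch_um(sensor_width_mm, sensor_height_mm, megapixels):
    """Approximate side length of one square pixel, in microns."""
    area_mm2 = sensor_width_mm * sensor_height_mm
    pixels = megapixels * 1e6
    return math.sqrt(area_mm2 / pixels) * 1000  # mm -> um

# Same illustrative 1/3" sensor, two different pixel counts:
print(round(pixel_pitch_um(4.8, 3.6, 8), 2))   # 8 MP  -> ~1.47 um pixels
print(round(pixel_pitch_um(4.8, 3.6, 16), 2))  # 16 MP -> ~1.04 um pixels
```

Doubling the megapixels on the same sensor roughly halves each pixel's light-gathering area, which is exactly the trade-off the paragraph above describes.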
Source: Quora
Apple, by contrast, gives its cameras the right number of pixels for the size of the sensor, so each pixel stays large enough to pull the most out of the light that reaches it.
iPhones feature optical image stabilization (OIS), and the iPhone 6S has a 12-megapixel rear-facing camera with a technology Apple calls Focus Pixels.
Optical image stabilization (OIS)
The image is stabilized by varying the optical path to the sensor. This compensation happens in real time, so no after-the-fact alteration or image degradation takes place. The lens assembly is moved parallel to the image plane: shake-detecting sensors (gyro sensors) transmit motion information to a microcomputer, which converts it into a drive signal that moves the lens assembly, correcting the image projected on the sensor before it is converted to digital format and thus offsetting your motion.
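The gyro-to-drive-signal loop described above can be sketched in a few lines. Everything here is an illustrative toy – the function name, sample rate, and focal length are assumptions for the example, not Apple's actual implementation:

```python
import math

def ois_drive_signal(gyro_rate_deg_s, dt_s, focal_length_mm):
    """Convert one gyro angular-rate sample into a lens-shift command.

    A rotation of theta radians shifts the projected image by roughly
    focal_length * tan(theta); the lens assembly is driven the same
    distance in the opposite direction to cancel the blur.
    """
    theta_rad = math.radians(gyro_rate_deg_s * dt_s)  # angle moved this sample
    image_shift_mm = focal_length_mm * math.tan(theta_rad)
    return -image_shift_mm  # move the lens opposite to the shake

# e.g. a 5 deg/s hand shake sampled at 1 kHz, with a ~4.15 mm phone lens
cmd = ois_drive_signal(5.0, 0.001, 4.15)
```

In a real OIS module this loop runs thousands of times per second, with the microcomputer feeding the command to voice-coil actuators that physically shift the lens.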
Electronic image stabilization (EIS)
Electronic image stabilization works on a completely different principle: the problem is solved in software, after the optical signal has already been converted to a digital one.
In EIS the processor breaks each frame into small chunks and compares them to the preceding frames. It determines whether the motion came from a moving subject or an unwanted shake, and makes the required correction. When you shift the image, you need to fill in the vacated area, and for that you need either a bigger CCD or some reserved area on the existing CCD that captures image data for off-screen use. This can lead to image degradation; however, the effect is fairly small when the camera has a reasonably high resolution.
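The "compare with the preceding frame, then shift back" idea can be illustrated with a tiny brute-force sketch. Real EIS pipelines use hardware motion vectors and sub-pixel warping; this toy version (frames as lists of lists of grayscale values, edges wrapping for brevity) only shows the principle:

```python
def roll2d(frame, dy, dx):
    """Shift a frame by (dy, dx) pixels, wrapping at the edges for brevity."""
    h, w = len(frame), len(frame[0])
    return [[frame[(y - dy) % h][(x - dx) % w] for x in range(w)]
            for y in range(h)]

def estimate_shift(prev, curr, max_shift=3):
    """Find the (dy, dx) that best re-aligns curr with prev."""
    h, w = len(prev), len(prev[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = roll2d(curr, dy, dx)
            err = sum(abs(shifted[y][x] - prev[y][x])
                      for y in range(h) for x in range(w))
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def stabilize(prev, curr, max_shift=3):
    """Shift curr back so it lines up with the previous frame."""
    dy, dx = estimate_shift(prev, curr, max_shift)
    return roll2d(curr, dy, dx)
```

Note how `roll2d` wraps pixels around the edge: that wrapping stands in for the extra sensor border the paragraph mentions, which real cameras reserve so there is genuine image data to fill the vacated area with.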
The problem with higher-resolution sensors is that they end up with smaller photosites, which in turn means less sensitivity and more noise.
Apple designs its own image signal processor and its own video encoder, both implemented in hardware on the SoC. As a result, focusing is faster, low-light filtering is more precise, and noise removal and color reproduction are better.
Here are some comparison tests (straight from Google 😛) between Apple iPhones and various other Android devices.
Well, that's it, guys! Hope you found this article interesting, and stay tuned for more!
Also check out our Facebook page for geeky memes. 😉