
If you’re a longtime mobile phone enthusiast like I am, you saw dedicated camera buttons on phones well before Apple entered the smartphone market. The Nokia N95 is a notable example that comes to mind. Of course, there was Sony Ericsson with its own models. And let’s not forget — though many have — the Windows Phone. But that was the past, and the Camera Control button is the present.
Apple isn’t first, but it’s making the “camera button” more powerful. Camera Control lets you use the phone like a regular camera, but it does much more. The iPhone camera becomes the “Third Eye” I have been talking about for a long time. I planned to write a book about the concept, but life got in the way.
As per Wikipedia, according to ancient Hindu Vedic texts, “the third eye (also called the mind’s eye, or inner eye) is a mystical concept that is about an invisible eye that provides perception beyond ordinary sight.” In our modern world, the phone and its camera have become a virtual third eye, providing us a radically new context for shaping everything from art and self to reality itself. This is a culmination of a lot of my thinking and writing around photography, the rise of computational photography, and visual computing, and how they lead us to the idea of augmented reality, with the camera as a portal into our digital interactions.

With the Camera Control button, the iPhone transforms into a tool for visual data capture and interaction. This concept, championed by Google through its Lens feature, has seen unexpectedly low adoption. It is a safe bet that more handset makers will add a “Camera Control”-like button and put it to work — Google’s got the technology.
The camera’s true potential in this area remains largely untapped. In 2017, when Snap initiated its IPO, I wrote a piece for The New Yorker discussing the camera’s pivotal role in the company’s operations.
The personal devices of the past decade have already made the camera more central to our lives than ever before; it has evolved into a multipurpose tool, a visual sensor, as useful for recording a lunch receipt as for capturing a dazzling landscape.
Snap was at one point in talks with Google to introduce a feature that would have allowed Snapchatters to perform Internet searches merely by pointing their phones at objects in the real world. That search feature never came to fruition, but it’s a useful indicator of where the mobile Internet is headed.
Seven years later, Apple is delivering on that idea. Later this year, Camera Control will become a gateway for visual intelligence. Open the Camera app, point it at an object, and you get details: a restaurant viewed through the camera can show its hours and ratings, a dog or any other item can be identified, and if something carries a date, you can add it to your calendar. This is “visual AI” without being labeled “AI.”
In that piece, I wrote:
The possibilities that come with thinking about the camera as a portal into the realm of information and services are attractive not only to Snap but also to every other big player in the tech world. Facebook, for instance, has slowly been enhancing the visual capabilities of its Messenger. If you show Google Assistant a pair of Nike Air Maxes, it can not only identify the shoes but also bring up related styles and direct you to a place where they are sold. Tim Cook, the C.E.O. of Apple, is high on A.R., too. At a technology conference last fall, Cook predicted that in the near future “a significant portion of the population of developed countries, and eventually all countries, will have A.R. experiences every day, almost like eating three meals a day.”

Facebook has already released eyewear that outperforms the iPhone on many of the functions Apple presented as “visual intelligence” in its keynote. Google also offers similar capabilities. Apple, however, could make it easy for a big swathe of everyday users to adopt these new techniques. While we all wait for an augmented reality big bang, we’re going to experience augmented reality in tiny bits in our daily lives.


Don’t get me wrong — I’m excited about the fact that you can use your iPhone a little more like a plain old camera. Like any camera, it works in both landscape and portrait modes. Thanks to its two sensors — force and touch — it can support many gestures and actions, such as swiping. For instance, the high-precision force sensor enables light press gestures, while the capacitive sensor allows touch interactions.
What seems like a toy button is much more. With it, Apple is finally making the camera the new interaction layer. The company is putting the camera in control, and this button will be a gateway to third-party tools. It could also change how we think about searching, buying, or asking for help with algebra problems. The question is: Can Apple convince developers that a camera’s capabilities extend beyond conventional photography?
September 11, 2024. San Francisco
What Camera Control can do for photographers.
- Quick Launch: Rapidly open the camera app (or any other third-party photo app).
- One-Click Photo Capture: Take a photo with a single click.
- Video Recording: Click and hold to start recording video.
- Tactile Switch: Provides a satisfying click.
- Camera Preview: Lets users frame their shot before capturing.
- Adjustable Controls: Slide a finger along the button to adjust zoom, exposure, or depth of field.
- Light Press: Opens up controls like zoom.
- Double Tap: Opens a camera preview menu where users can adjust exposure or depth of field.
- Two-Stage Shutter (coming this fall): Press lightly to lock focus and exposure, so you can reframe without losing focus.
- Third-Party App Integration: Developers can incorporate Camera Control functionality into their own apps; the Kino app, for example, will allow photographers to adjust white balance and set focus points. (A rough sketch of what that integration could look like follows this list.)
- Depth Control: Set focus at various levels of depth in a scene.
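For developers wondering what that third-party integration might look like, here is a minimal, illustrative sketch in Swift. It assumes the AVCaptureControl additions Apple previewed for iOS 18 (AVCaptureSystemZoomSlider, AVCaptureSlider, and the AVCaptureSessionControlsDelegate callbacks); the “Tint” control is a made-up example for illustration, not Kino’s actual code, and exact API names may differ from what ships.

```swift
import AVFoundation

// Illustrative sketch only: attaches a system zoom slider and a custom
// slider to the Camera Control button on iOS 18+. The "Tint" control is
// a hypothetical example, not any shipping app's implementation.
final class CaptureControlsCoordinator: NSObject, AVCaptureSessionControlsDelegate {
    private let session = AVCaptureSession()
    private let sessionQueue = DispatchQueue(label: "capture.session.queue")

    func attachControls(to device: AVCaptureDevice) {
        guard session.supportsControls else { return }

        // System-provided zoom slider, driven by sliding a finger on the button.
        let zoom = AVCaptureSystemZoomSlider(device: device)

        // Custom slider surfaced in the Camera Control overlay.
        let tint = AVCaptureSlider("Tint", symbolName: "circle.lefthalf.filled", in: -1.0...1.0)
        tint.setActionQueue(sessionQueue) { value in
            // Feed the chosen value into the app's own processing pipeline.
            print("Tint adjusted to \(value)")
        }

        session.beginConfiguration()
        let controls: [AVCaptureControl] = [zoom, tint]
        for control in controls where session.canAddControl(control) {
            session.addControl(control)
        }
        session.setControlsDelegate(self, queue: sessionQueue)
        session.commitConfiguration()
    }

    // Callbacks fire as the user engages or releases the Camera Control button.
    func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {}
    func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {}
    func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {}
    func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {}
}
```

The point of the sketch is less the specific calls than the shape of the idea: the app declares which of its own settings should live on the hardware button, and the system handles the overlay, the gestures, and the haptics.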