Apple’s iPhone 16 Pro Gets a Little Smarter With Apple Intelligence

The iPhone 16 has been one of the weirder smartphone launches I’ve seen from Apple in the nine years I’ve been covering tech. Many of the product’s new capabilities revolve around Apple Intelligence—a suite of smart features powered by artificial intelligence—which isn’t available at launch. Apple will roll out a software update in October to deliver these promised perks.
But my entire testing period with the iPhone 16 Pro and iPhone 16 Pro Max has been on a developer beta of iOS 18.1, which already has many (though not all) of these Apple Intelligence features running. I feel like I have a good handle on what the experience will be like for iPhone owners in October, though this early version of the software also had a few bugs. While I was doing some performance testing, for example, Resident Evil Village crashed a few times whether or not I had maxed out the graphics settings. Apple couldn’t replicate the issue but suggested I roll back to the stable version of iOS 18 to fix the problem.
Broadly, there are some nice improvements in the iPhone 16 Pro and iPhone 16 Pro Max. They deliver some of the best performance you’ll find on a phone, solid battery life, and excellent camera systems. When you throw in Apple Intelligence, there are moments when the new capabilities are helpful day to day, but there is no must-have software perk (yet) that puts Apple’s AI features leaps ahead of the smartphone pack, nor one worth upgrading for on its own.
In some ways, Apple Intelligence makes it feel like the company is playing catch-up. Take smart replies as an example. This is one of the new “AI” features in Apple’s Mail client and the Messages app. If someone messages you, you’ll get a prompt above the keyboard to send a reply generated from the context of the conversation.
My recipients might not be surprised to hear it, but I have been liberally using smart replies in Gmail and Google Messages on Android for years. I am very thankful for the seconds I’ve saved not having to type out, “Thank you!” or “Sounds good.” Now, iPhone owners can take advantage. However, I have so far only seen these smart replies in the Messages app (likely another beta bug).
Siri is a big part of the Apple Intelligence package. Now, when you activate the voice assistant, there’s a lovely glow around the entire screen. You can even keep using your phone after triggering Siri, so asking a question doesn’t have to interrupt what you’re doing. If you make a mistake in your question or change your mind mid-sentence, Siri can still decipher what you’re asking and answer.
Siri has deep knowledge of the iPhone and its settings, so you can ask it questions like, “How do I update apps?” and it’ll dish out the instructions. Hilariously, I asked how to type to Siri—a new capability that lets you use Siri without speaking to it—and the voice assistant did not understand my question. (It’s a double tap on the bottom edge of the screen.) This is all a much-needed upgrade for Siri. I mean, you’ve been able to type to Google Assistant and Alexa for years. (Technically, you’ve been able to type to Siri before, but it was an accessibility setting you had to manually enable, and it wasn’t well advertised.)
Whether Siri’s answers are more useful is still a toss-up. I like that I can speak to it more naturally, and the ability to type to it is easily a win, but I’ve asked it a handful of queries where it didn’t know the answer and presented me with Google Search results. In the future, Siri will have a ChatGPT integration, drawing on the power of large language models to be more helpful with these kinds of queries (very much like Gemini on Android). It will also be able to connect with other apps, like Mail and Messages, so you can ask personal questions, like when your flight is landing.
There are a handful of “Writing Tools” available throughout the operating system, all powered by machine intelligence. This is something competitors like Google and Samsung have been pushing hard too. When you select text on the iPhone screen, you’ll see “Writing Tools” in the overlay menu. Tap it, and you get a bunch of options: proofread, rewrite, summarize, and more. You can change the tone of an email you’re writing to sound more professional or casual. You can proofread your text for more than just grammar. Or you can paste large blocks of text from a PDF into an email and summarize them with a tap.
Remembering to use these features will take some time because it’s not immediately obvious they’re present. How useful they will be depends a lot on your workflow. I rarely need to summarize blocks of text, so I haven’t found much use for that. The “professional” tone it uses when you ask it to rewrite text is a little too professional for me. The proofreading tool is a bit more handy, but I wish this feature were quicker to access.
Creating summaries seems to be the thing everyone wants AI to do, and Apple Intelligence is ready to oblige. You can have your emails, your messages, and even your notifications from third-party apps summarized. Some of this can be handy, like when the Mail app calls out an urgent-sounding email in its summary that I would have missed had I just glanced at the giant collection of emails. But more often than not, I just swipe away the summary and dive into all the notifications.
Speaking of summaries, there’s one built into Safari, but you have to put the web page into Reader mode first. It’s these kinds of things that make it hard to find these smart features and remember that they exist. At the very least, I was able to summarize an 11,000-word story and get the gist of it when I didn’t have time to sit down and read it. (Sorry.) I’ll forgive you if you summarize this review.
Arguably the most helpful Apple Intelligence features for me, a journalist who attends multiple briefings a month, are the new transcription tools in the Notes and Voice Memos apps, and even in the Phone app. Hit record in Voice Memos or Notes, and the apps will transcribe conversations in real time! If you’re on a phone call, tap the record button; after both parties are notified, the call starts recording, and a transcription is saved to your Notes app.
For all of these, much depends on the microphone quality on the other end of the conversation. Either way, it’s certainly better than no transcription at all. It’s too bad there are no speaker labels, like in Google’s Recorder app. You also can’t search these recordings to find a specific quote. (Technically, you can if you add the transcript to a note in the Notes app, but it’s an extra step.)
The Photos app is getting an Apple Intelligence infusion too, and the highlight here is the Clean Up feature. Much like Magic Eraser, which debuted on Google’s Pixel phones more than three years ago, it lets you delete unwanted objects in the background of your iPhone photos. This works pretty well in my experience, though I’m a little surprised Apple gives you so much freedom to erase anything. I completely erased my eye from existence in a selfie. I erased all the fingers off my hand. (Google’s feature doesn’t let you erase parts of a person’s face.)
Next, I erased my mug, which was in front of my face as I went for a sip, and Clean Up tried to generate the parts of my face that were previously hidden, with some horrifying results. (For what it’s worth, I tried this on the Pixel 9 and the results were just as bad, though Google did give me more options.) As my coworker said in Slack, “They both seem to have been trained on images of Bugs Bunny.”
There’s more to come in Apple Intelligence. Image Playground will let you generate images. Genmoji will let you create new kinds of emoji that right now exist only in your mind. Siri will be able to serve up more contextually relevant information. But I’ll have to do another dive into Apple Intelligence when those features arrive later this year. Just a reminder that Apple Intelligence is part of the next iOS 18 update but only available on select devices: the iPhone 15 Pro, 15 Pro Max, and the entire iPhone 16 range.
Also key to Apple Intelligence is that the company is deploying these features in a way that is arguably more private and secure than anything that has come before, though security researchers are still evaluating the efficacy of Apple’s Private Cloud Compute technology. (You can read more about this proprietary process here.) So if it seems like Apple is late to the game on some of these features, that is the company’s M.O.: taking its time to deliver a better and usually more secure experience.
Oh right, there’s a new phone too! I’m not going to go over all the specs and such—I have a lot of that information here—but let’s touch on what the new changes are like. First, I would like to ask Apple to please start giving its Pro phones more fun colors. It’s silly that the iPhone 16 Pro is relegated to neutral palettes like Desert Titanium, with no way to get it in pink or ultramarine like the iPhone 16 and iPhone 16 Plus. (You can always try a fancy case!)
The Pro models are bigger than before. I initially saw these as minor screen size bumps stemming from the slimmer bezels around the screen, but it’s more than that. They are slightly taller than previous iPhones. This isn’t so much of a problem with the iPhone 16 Pro, but the iPhone 16 Pro Max was already big to begin with. Now, it’s even harder to reach the top of the phone with a stretched thumb. I don’t think a physical size bump was necessary.
I want to say you should stick with the smaller iPhone 16 Pro, but then you’re faced with a choice between a more comfortable experience and longer battery life. The Pro Max easily beats the iPhone 16 Pro in run time, often hitting more than six hours of screen-on time with more than 30 percent left in the tank at the end of the day. You can probably take it into the morning of day two. On the iPhone 16 Pro, I was able to get more than seven hours of screen-on time, but that firmly put the battery at 15 percent after 12 hours (8:30 am to 8:30 pm). You’re more likely to need a top-up before bedtime if you’re a heavy user. (It’s too bad these phones don’t have the easier-to-replace battery of the iPhone 16.)
At least you can wirelessly charge these phones faster—about 50 percent in 30 minutes with a MagSafe Charger paired with a 30-watt power adapter. (Side note: The 30-watt iPhone charger Apple sells separately is still comically large compared to what you can get elsewhere on the market.)
These phones are powered by the A18 Pro chipset, which is far and away the most powerful smartphone processor according to my benchmark tests, easily blowing away much of the Android competition. It has one extra graphics core compared to the A18 in the iPhone 16, and its CPU cache sizes are larger, which means the A18 Pro is generally faster than the A18 as a whole.
I have not really complained about performance on a high-end phone in years, but a lot of this power comes into play with AI tasks as well as graphics-intensive games. I mainly stress-tested the phones by playing Resident Evil Village and Assassin’s Creed Mirage with the Backbone One controller. I maxed out the graphics on both, and the former looked great (Mirage could use more polish, Ubisoft), but I did notice some stutters, and both games ended up crashing after some time. The phones did get hot, but nothing out of the ordinary.
This might be because I’m on a developer beta, but I did not have time to erase the phone to move to a stable version and retest. For what it’s worth, iPhones crashing while playing these AAA games doesn’t seem to be that out of the ordinary. The more I played, the more I reaffirmed that I would rather buy these games on a console or PC platform to enjoy the experience on a much bigger TV or computer screen.
As for AI tasks, the only feature that seemed to take its time was Clean Up in the Photos app. It was fairly quick to erase items in an image, but reverting things to normal took more time than I expected. (We’re still talking seconds here.) Outside of these demanding tasks, I’ve not seen anything to make me worry about the phone’s processing power.
It’s the camera experience that really makes me swoon over these iPhones. It all starts with the brand-new button: Camera Control. You might find yourself accidentally pressing this button all the time and launching the camera; thankfully, you can set it so that only a double-press will activate it. I like having it as a camera launch button because now it frees up the Action Button to trigger a different function.
However, I do not particularly care for the sliding gestures on the Camera Control button. A light double-press (where you don’t actually push the button down) triggers an overlay menu that lets you slide your finger left or right on the button to cycle through camera modes. A light single press selects one, and you can then slide through things like zoom levels or different Photographic Styles (Apple’s color-grading filters). I find it much faster to make the change on the screen rather than sliding my finger precariously along this button. I may get used to this after a while, but right now, I just like having a dedicated shutter button. (Pro tip: It’s also much better for taking selfies with the rear cameras.)
Broadly speaking, the iPhone 16 Pro camera system (now identical on both Pro models) is pretty great. Whether you snap an image with the new 48-megapixel ultrawide or the 48-megapixel Fusion camera, you’re going to be pretty happy with the results. My favorite is always the 5X optical zoom camera, which lets me get clear shots of subjects farther away. There are moments when I prefer photos from the Google Pixel 9 Pro (see: panoramas and ultrawide images), and other times when the iPhone pulls ahead. It’s an excellent camera system at the top of the game.
Apple has zhuzhed up its Photographic Styles, which are not like Instagram filters. Styles adjust the colors in an image much more granularly, whereas a filter just sits on top of your photo. You can shoot in a specific style (I love the “Gold” one) or change styles after the fact. There’s a lot more customization here, and what stands out is Apple’s method of letting you change your skin tone.
Unlike Google’s Real Tone, an image-processing technique trained on a range of skin tones to produce more accurate results, Apple lets you adjust your skin tone among various “Undertones.” I understand the reasoning here, because sometimes you may want your skin tone to look a particular way in specific lighting. That said, in an apples-to-Apples comparison, I found the Pixel 9 Pro’s reproduction of my skin tone more accurate right off the bat, but the iPhone gave me more tools to tweak it to my liking. I’m not sure how often I’d hop into a photo’s editing settings to change it, though.
There are two other standout features in the camera system. First is the ability to shoot video at 4K and 120 frames per second. This is exclusive to the iPhone 16 Pro models and essentially lets you capture more frames at a higher resolution, which gives you smoother slow-motion footage. It works great, and I liked it so much I made a tiny short movie out of a day’s excursion.
Then there are the “studio-quality mics” on the Pro models. To my ears, they don’t sound significantly better than the mics on the iPhone 16, except when there’s a lot of background noise. That’s when they do a better job of isolating your (or your subject’s) voice. But the best feature of all is Audio Mix, which isn’t exclusive to the Pro iPhones. This lets you change how your video’s audio is output, with three modes to choose from: In-frame, Studio, and Cinematic. In-frame focuses on the people in the frame, Studio simulates a studio-like audio setup, and Cinematic focuses on the people in the center of the frame and throws in some ambient noise.
I took some video by the Manhattan Bridge the other night with the subway rumbling nearby. It was very loud, with some wind and background chatter, and the default audio from my video was just OK. I could hear myself fine, but there was a lot of noise. Swapping to “Studio” was pretty great. It did a remarkable job of stripping away a lot of that noise and focusing on my voice, though I do sound a little more processed. I compared it to Google’s Audio Magic Eraser, and while that tool did a similar job, I had to play around with its settings a little more to reach the same level of quality.
Overall, I had a fairly enjoyable time with the Pro iPhones this year. I’ve now switched over to the iPhone 16 and iPhone 16 Plus for testing, but one thing I want to note so far is that the differences between the Pro iPhones and the non-Pro models aren’t that great this time around. The biggest one to me is the lack of a 120-Hz screen refresh rate on the iPhone 16, but outside of this, it’s a very similar experience, especially since most people aren’t using “Pro” features like recording video in the ProRes codec. Dare I say the iPhone 16 also just looks better?
The Apple Intelligence capabilities are promising, even if some of them feel like Apple playing catch-up, and the marketing sure feels overhyped. But the rest of the iPhone 16 Pro’s hardware is polished and powerful too. (More buttons!) I only wish Apple would copy more of Google’s call screening features, because one thing’s for sure: ever since I switched to these iPhones from the Pixel 9 series I recently reviewed, the spam calls will not stop.
