Apple has brought its entire lineup back into sync. It has done so without launching a new device, without presenting a more powerful chip or an unexpected product that makes headlines. Instead, it has completely redesigned the experience of its operating systems, from the iPhone to the Mac, including the iPad, Apple Watch, Apple TV, and Vision Pro. And with a very clear goal: to unify, simplify, and modernize our relationship with its screens.
The new visual language, dubbed Liquid Glass, marks the biggest redesign in more than a decade. Surfaces are now more translucent, elements float more smoothly, edges are rounded, and transitions glide with a fluidity that is inevitably reminiscent of visionOS, the Vision Pro system. It is no coincidence. Apple wants all of its products to speak the same language, visually and functionally.
A shared aesthetic, a common experience
From iOS 26 to macOS 26 (dubbed “Tahoe”), iPadOS, watchOS, and tvOS, the aesthetics have been rewritten to approach a more spatial, more immersive logic. Interfaces are no longer just functional: they also breathe, change with the environment, and adapt to the content. The iPhone lock screen becomes dynamic. The Camera, Music, and Safari apps refresh their designs. Apple Watch shows floating watch faces. Even CarPlay gets a dose of the new graphical language.
This commitment to cohesion has a practical effect. For those who use multiple Apple devices, the learning curve is reduced. Jumping between screens is more intuitive. Everything seems to be part of the same system, even if it is used in different contexts. And that sense of continuity is something Apple has sought for years but is now reinforcing with tangible results.
The aesthetic changes come with new functionality as well. iOS 26, for example, adds improvements to Messages—such as polls and smart filters—and a revamped Phone app with real-time transcription and simultaneous translation that works even on calls with people who don't use Apple devices. These are details that fine-tune the daily experience and consolidate useful functions without reinventing anything, an approach the Cupertino company has taken to the extreme in this release.

The intelligence that doesn’t need the cloud
As expected, one of the most relevant announcements of WWDC 2025 has less to do with what you see and more with what happens inside. Apple is reinforcing its commitment to Apple Intelligence, an artificial intelligence architecture integrated into the system that does not need to rely on the cloud. It is neither an app nor a separate assistant: it is a layer that adds context, autonomy, and local responses to different system functions in a discreet but strategic way.
This approach has clear advantages. By running on the device, the AI models respond faster, consume fewer resources, and better protect privacy. Visual Intelligence, for example, lets you detect objects or pieces of text on the screen and act on them directly: copy, translate, search for information, or share. All without leaving the app and without sending data off the device.
The openness to developers is also significant. Apple will provide tools for third-party apps to take advantage of these capabilities. This is an opportunity to enrich the functions of already known apps without having to resort to external services. Intelligence, therefore, is not just another app: it is a layer of the operating system that others can use.
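Apple has not finalized these developer tools at the time of writing, so the following Swift sketch is purely illustrative. It assumes an on-device language-model session API with names (`LanguageModelSession`, `respond(to:)`) loosely modeled on the Foundation Models framework Apple previewed at WWDC 2025; the actual signatures may differ when the SDK ships.

```swift
import FoundationModels  // Previewed Apple framework; API surface may change

// A third-party app asking the system's local model to summarize user text.
// No network request is made: under this design, the model runs entirely
// on the device, which is where the privacy guarantee comes from.
func summarize(_ note: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Summarize the user's note in one sentence."
    )
    let response = try await session.respond(to: note)
    return response.content
}
```

The appeal for developers is that a call like this would replace a round trip to an external AI service, with Apple handling the model, the hardware acceleration, and the privacy boundary.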
This bet, however, also has limits. By relying almost exclusively on on-device processing, Apple is forced to prioritize lighter, more compact models. Some advanced features may require recent hardware, and others may fall short of the performance and versatility offered by platforms that run entirely in the cloud (ChatGPT, Gemini, Grok…). That is the cost of guaranteed privacy and of an ecosystem where control matters as much as the user experience. Cupertino seems very clear that its AI will be different.
Apple Intelligence is also more present on other devices. In macOS Tahoe, shortcuts become more powerful and can be triggered by specific conditions, such as location, time of day, or app usage. In Vision Pro, AI facilitates gaze control and improves the representation of avatars during collaborative calls. Even the Apple TV and Apple Watch integrate subtle improvements, which adjust the interface or adapt functions according to use.
Not everything we hoped for has arrived yet. Siri, Apple's voice assistant, has not received the deep redesign that some rumors promised. The company says the new version will arrive in 2026, after an adjustment process that prioritizes the quality of responses and contextual integration. For now, the assistant keeps working as before, but it still falls well short of what we expect from Apple.

The commitment to continuity
With this WWDC, Apple has not sought spectacular headlines or promises that are difficult to deliver in practice. It has preferred to show a solid evolution, focused on experience and consistency. The design is more coherent, its AI advances without losing its characteristic philosophy, and the system is more open to developers without giving up control over privacy and stability.
For many users, the changes will be subtle but significant. You don’t have to learn anything new, but everything works better. Actions are simplified. The interaction becomes more fluid. And although there is not a great “wow” effect, there is meticulous work behind it that reinforces the feeling of a careful product.
The rollout will begin this fall, with public versions of iOS 26, macOS Tahoe, iPadOS 26, watchOS 26, tvOS 26, and visionOS 26. Developer betas are available now. And if one thing has become clear, it is that Apple has decided to build on what it already has rather than tear it down and start over. If any company can afford not to take risks, it is the one from Cupertino.
It is not a revolution, but it is an important adjustment, to be sure. One that unifies, modernizes, and prepares the ecosystem for what's to come. Because, although Apple does not say it out loud, its future does not lie only in AI or hardware. It lies in how these pieces connect, how they integrate, and how they make everything work while asking very little of the end user.
Apple has confirmed that some of the more advanced Apple Intelligence features will require A17 Pro or M-series chips, so not all current devices will be compatible with the full suite of new features. And for developers and managers of corporate apps, WWDC also leaves a warning: new APIs, interface redesigns, and contextual capabilities open up opportunities, but they also demand adaptation.