Apple Unveils Major Software Upgrades and AI Features at WWDC 2025

At its annual Worldwide Developers Conference (WWDC) held on June 9 in Cupertino, California, Apple introduced a wave of updates across its software ecosystem. The event showcased fresh interface designs, a new software naming approach, and expanded features in Apple Intelligence, with developers now gaining broader access to the company’s foundational AI technologies.

A New Look: “Liquid Glass” Interface

One of the most noticeable changes comes with the introduction of a new design aesthetic called “Liquid Glass.” Drawing visual inspiration from the Vision Pro’s visionOS, this style offers translucent, reflective UI elements that respond dynamically to lighting and motion.

Elements such as buttons, sliders, navigation bars, and sidebars are being redesigned under this new visual framework. Apple also announced updated APIs to allow developers to begin customizing their apps in line with this look ahead of its full deployment later in 2025.
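For developers wondering what adopting the new look might involve, here is a minimal SwiftUI sketch. The glassEffect modifier and GlassEffectContainer names reflect what Apple previewed for the redesign, but the exact signatures should be treated as assumptions until the final SDK documentation ships.

    import SwiftUI

    // Sketch: opting a small toolbar into the Liquid Glass look.
    // glassEffect(_:in:) and GlassEffectContainer are the names Apple previewed
    // for iOS 26; exact signatures are assumptions here, not confirmed API.
    struct GlassToolbar: View {
        var body: some View {
            GlassEffectContainer {
                HStack(spacing: 12) {
                    Button("Share", systemImage: "square.and.arrow.up") { }
                    Button("Favorite", systemImage: "heart") { }
                }
                .padding()
                // Apply the translucent, light-reactive material behind the controls.
                .glassEffect(.regular, in: .capsule)
            }
        }
    }

In practice, controls that already use standard system components are expected to pick up much of the new styling automatically, with modifiers like the one above reserved for custom views.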

Fresh Naming for iOS Versions

Breaking from tradition, Apple is retiring its sequential iOS numbering system. Rather than calling the upcoming update iOS 19, the company is moving to year-based version names, much as carmakers label model years: the next major release, covering the 2025–26 cycle, will be iOS 26.

Alongside the new name, the OS includes notable UI and functionality enhancements. The Phone app gains smart call screening capabilities, while Messages will support personalized chat backgrounds. Meanwhile, Xcode is receiving generative AI tools to help developers code more efficiently, with options to integrate external models such as ChatGPT.

Expanding Apple Intelligence

Apple Intelligence is becoming more powerful and more accessible. A new feature, Live Translation, will provide real-time multilingual conversation support in text messages, phone calls, and FaceTime using on-device AI models.

Apple Pay is also getting smarter—users will soon be able to track shipments even for purchases made outside the Apple ecosystem. In addition, Image Playground now supports AI-generated imagery using OpenAI’s ChatGPT.

Perhaps most notably, Apple is opening its on-device foundational AI model to developers via a new Foundation Models framework. This initiative allows app creators to develop intelligent and private offline experiences within their software.
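As a rough illustration of what that access could look like, the sketch below assumes the FoundationModels framework exposes a session-style API; the SystemLanguageModel, LanguageModelSession, and respond(to:) names follow Apple's preview materials but should be read as assumptions rather than confirmed signatures.

    import FoundationModels

    // Sketch: asking the on-device model to summarize user text entirely offline.
    // LanguageModelSession and respond(to:) reflect the API shape Apple previewed;
    // treat the exact names and signatures as assumptions.
    func summarize(_ note: String) async throws -> String {
        // Fall back gracefully if the on-device model is unavailable on this device.
        guard case .available = SystemLanguageModel.default.availability else {
            return note
        }
        let session = LanguageModelSession(
            instructions: "Summarize the user's note in one sentence."
        )
        let response = try await session.respond(to: note)
        return response.content
    }

Because the model runs locally, a call like this would work without a network connection and without sending the user's text off the device, which is the privacy argument Apple is making for the framework.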


Smarter Visual Understanding

A new tool under the Apple Intelligence umbrella—Visual Intelligence—aims to help users better understand what’s on their screen. Whether viewing products, photos, or events, the feature can surface relevant suggestions, such as adding an event to your calendar or locating similar items through services like Google and Etsy.

This functionality will be triggered by the same shortcut currently used to capture a screenshot, streamlining its integration into daily use.

Jijo George

Jijo is an enthusiastic fresh voice in the blogging world, passionate about exploring and sharing insights on a variety of topics ranging from business to tech. He brings a unique perspective that blends academic knowledge with a curious and open-minded approach to life.