At a private Fortune Brainstorm Tech dinner, Qualcomm Chief Executive Cristiano Amon discussed the future of “physical AI,” a push to run artificial intelligence on devices rather than in distant data centers. His remarks signal how the next phase of AI may reach phones, PCs, vehicles, and headsets, where speed, privacy, and battery life matter most. The conversation reflects rising pressure across the tech sector to move AI closer to users as costs for cloud services grow and demand for real-time features rises.
What “Physical AI” Means
Physical AI describes models that run directly on the devices people hold or install, rather than exclusively in the cloud. That can include smartphones, laptops, cars, wearables, and industrial systems. The goal is to make AI feel instant and personal, without sending each request through a network connection.
Running models locally reduces latency and can improve privacy, since sensitive data may not leave the device. It also helps when connections are weak or unavailable. But it requires chips that can handle heavy compute loads within tight power limits.
Why Qualcomm Cares
Qualcomm built its business on mobile chips designed for efficiency. That experience is now central to on-device AI. The company has been investing in neural processing units and software tools so developers can run larger and more capable models on consumer hardware.
Phones and PCs are the first targets. Voice assistants, image editing, translation, and personal agents work better when they respond in milliseconds. In cars, driver monitoring, computer vision, and infotainment also benefit from local processing. Mixed reality and wearables, where comfort and battery life are critical, are another fit for this approach.
Industry Momentum and Trade-Offs
Interest in on-device AI has grown as generative models strain cloud capacity and add recurring costs. Companies are seeking a split model, where smaller tasks run locally and larger ones shift to the cloud. That hybrid approach can trim cost and improve reliability.
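The hybrid split described above can be sketched as a simple routing policy: run a task locally when a small enough model can handle it, and fall back to the cloud for heavier work. The task names, model sizes, and budget below are illustrative assumptions, not any vendor's actual implementation:

```python
# Illustrative sketch of a hybrid edge/cloud routing policy.
# Task names, model sizes, and the on-device budget are hypothetical.

ON_DEVICE_BUDGET = 2.0  # rough local compute budget, in billions of parameters

# Hypothetical catalog mapping tasks to approximate model sizes (B params)
TASK_MODEL_SIZE = {
    "keyword_spotting": 0.05,
    "live_translation": 1.0,
    "photo_edit": 1.5,
    "long_doc_summary": 13.0,
}

def route(task: str, network_available: bool) -> str:
    """Run small tasks locally; send large ones to the cloud when possible."""
    size = TASK_MODEL_SIZE.get(task)
    if size is None:
        raise ValueError(f"unknown task: {task}")
    if size <= ON_DEVICE_BUDGET:
        return "device"           # low latency, data stays local
    if network_available:
        return "cloud"            # heavier model, recurring cost
    return "device-degraded"      # offline fallback to a smaller local model

print(route("live_translation", network_available=True))
print(route("long_doc_summary", network_available=False))
```

The point of such a policy is exactly the trade the article describes: keep latency-sensitive, privacy-sensitive tasks on the device, and pay cloud costs only when the workload genuinely exceeds local limits.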
Physical AI still faces engineering limits. Devices must balance heat, battery drain, and memory against user expectations for quality. Developers want consistent tools across phones, PCs, and cars, but hardware varies widely. Security also matters, since AI features may touch personal content on the device.
- Power and thermal limits cap model size on consumer hardware.
- Memory bandwidth and storage affect speed and accuracy.
- Standardized tooling is needed for reliable app performance.
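A back-of-envelope calculation shows why the memory limit in the list above bites. Weight storage scales with parameter count times precision, which is why quantization is central to on-device AI; the parameter counts here are illustrative, and real deployments also need memory for activations beyond the weights themselves:

```python
# Back-of-envelope memory footprint for model weights at different precisions.
# Parameter counts are illustrative examples, not any specific product's model.

def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight storage in GB (1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4):
    print(f"7B-parameter model at {bits}-bit: {weight_memory_gb(7, bits):.1f} GB")
# A 7B model needs 14.0 GB at 16-bit but only 3.5 GB at 4-bit, which is
# why aggressive quantization is standard practice on phones whose total
# shared RAM is often in the 8-12 GB range.
```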
Competition and Partnerships
Chipmakers across the market are racing to support on-device AI. Mobile rivals are shipping their own AI features. PC platforms are adding dedicated acceleration. Automakers and suppliers are expanding in-cabin processing. Software frameworks that target multiple chips could decide where developers focus their efforts.
Partnerships with handset makers, PC brands, and car companies will shape progress. Carriers and cloud platforms will also influence how workloads split between edge and cloud. If physical AI reduces network traffic, pricing and business models may shift.
Use Cases Taking Shape
Several consumer features are set to benefit first. Photo and video tools can generate edits offline. Transcription and translation can happen in real time without a connection. Personal assistants can summarize messages and draft replies while keeping data local.
In vehicles, local AI can monitor driver attention, adapt interfaces, and aid navigation. In industrial settings, edge AI can spot defects on production lines or detect hazards without sending data offsite.
What To Watch Next
Three signals will show how fast physical AI matures. First, the speed and efficiency of next-generation chips. Second, whether developers can bring popular models to consumer devices without major rewrites. Third, whether users notice faster, more private features and keep them on by default.
Cristiano Amon’s focus on physical AI highlights a clear industry shift. AI that runs where people are stands to lower costs, reduce delays, and protect data. The hard work now lies in squeezing bigger models into small, efficient devices, and making tools that work across many products. Expect steady gains in everyday apps, with hybrid designs that borrow the best of the edge and the cloud.