
Gemini starts booking, ordering and planning directly on your phone
Google and Samsung are switching on Gemini-powered task automation and Ask Maps, letting AI operate apps and plan trips in natural language — reshaping how phones handle commerce, privacy and work.
Google is quietly turning your phone into an agent that can act on your behalf, not just talk back to you. Gemini can now open apps, fill in forms and line up bookings, while a new Ask Maps experience brings the same AI into one of the most-used apps on the planet.
Together, they mark a shift from chatbots to AI that directly participates in everyday transactions.
What’s Rolling Out Now
On Samsung’s new Galaxy S26 line, Google and Samsung are rolling out a Gemini “screen automation” capability that lets the assistant open and control apps to complete multi-step tasks like ordering food, booking rides and managing deliveries. Reviews of the feature show Gemini navigating real apps, adding items to carts and stepping through checkout flows after the user approves the final confirmation screen. Android Central reports that the feature is arriving via a software update and is initially limited to the Galaxy S26 series in select regions.
The same underlying “agentic” behavior has been in testing on Google’s side as a screen automation feature that lets Gemini place orders or book rides by taking over compatible apps and tapping through the UI instead of just sending you links. That behavior was first spotted in Google app betas as an upcoming capability designed to “interact and automate basic actions on behalf of a user, such as placing orders or booking rides.” Moneycontrol described the feature as a time-saver that effectively turns Gemini into an app operator.
At the same time, Google is overhauling Maps with a Gemini-powered Ask Maps feature that lets users plan trips and outings in plain language. You can ask for “a two‑day itinerary with kid‑friendly museums and coffee shops near parks” and get conversational suggestions backed by Maps’ database of hundreds of millions of places and contributor reviews. AP News reports that Ask Maps is starting on iOS and Android in the U.S. and India before expanding to more countries.
Why It Matters: Commerce, Labor and Risk
For consumers, this promises less time tapping through screens and more “set it and forget it” errands: reordering groceries, grabbing a ride home or stitching together a weekend away becomes a single instruction. That convenience also shifts power toward the platforms that control the agents, since they decide which apps and merchants are supported or prioritized. Google executives have already declined to say whether Ask Maps results could eventually be influenced by paid promotion, according to AP News.
This agentic model also echoes what Google’s broader AI Mode in search has begun to do on the web: not only find restaurant or event options, but walk through ticket and reservation flows for you, then present curated choices and handle the booking. As Yahoo/Tech has detailed, the goal is a search experience where agents perform the tedious steps.
There are clear privacy and fraud stakes. Granting an AI permission to read screens, parse messages and control apps concentrates sensitive data and payment access in a single system, something security researchers have already flagged as a new attack surface in critiques of Gemini’s app access model. Malwarebytes has warned that expanded app connections and message access increase the blast radius of any compromise or misconfiguration.
And as AI agents become capable of doing the work of concierges, travel planners and segments of gig work — from coordinating deliveries to creating detailed itineraries in Maps — they could gradually erode demand for human intermediaries, even as they create new expectations that every service be “agent‑ready.”
The New Default Interface
Ask Maps may be the more visible change in the short term, dropping a conversational AI layer into an app with more than 2 billion users. But the more profound shift is Gemini acting inside other apps, where the UI becomes something the AI sees and manipulates on your behalf.
If the rollout holds and expands beyond early Galaxy and Pixel devices, the default way to interact with services may no longer be tapping through their interfaces — it will be telling your phone what outcome you want, and letting an invisible agent handle the rest.