
Google tests Gemini-powered screen automation on Android phones

New Android beta reveals how Gemini may soon control apps to place orders and book rides

February 04, 2026

Google is moving closer to letting Gemini take actions directly on Android phones, with new details emerging from the latest Google app beta.

The Google app 17.4 beta includes references to a new Labs feature called "Get tasks done with Gemini". Internally codenamed "bonobo", the feature will let Gemini perform actions in certain Android apps on the user's behalf.

According to the strings, Gemini could help with tasks such as placing orders or booking rides by interacting directly with on-screen elements.

The capability, referred to in the strings as screen automation, is expected to rely on groundwork being laid in Android 16 QPR3. Google has not confirmed which apps will support it, noting only that it will work in select apps.

Google is clear that Gemini will not be fully autonomous. Because Gemini can make mistakes, users will be able to stop the AI at any time and take over a task manually.

On privacy, Google states that when Gemini interacts with apps, screenshots may be reviewed by trained reviewers if activity tracking is enabled. Google advises users not to enter login details or payment information into Gemini chats and to avoid using screen automation for emergencies or sensitive tasks.

The beta also references a separate feature, codenamed "wasabi", linked to a likeness system. This appears to relate to the 3D avatars used in Android XR and Google Meet. Strings suggest users may be able to access or retake their likeness through a prompt, with Google emphasising that the likeness can only be used by the individual user.