Android P's multitasking gestures are part of the Google launcher: will they be exclusive to Pixel and Android One?

Among all the Android P innovations presented last week at Google I/O 2018, the most striking was its new navigation system: Google's mobile operating system is switching to gestures for managing multitasking.

Well, it seems that this new navigation system is not part of the operating system itself, at least not in the current Android P beta: the new gestures and the new multitasking view belong to the Pixel Launcher.

If another launcher is installed, the gestures stop working, and if the new navigation system is deactivated in favor of the classic one, tapping the Recents button still brings up the Pixel Launcher's multitasking view.

If you go further and disable the Pixel Launcher, tapping the Recents button shows the classic vertical view of open/recent applications, the same one found in Android Nougat or Oreo. The new design is therefore not part of the operating system itself.
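For readers who want to reproduce this check, the launcher can be disabled over adb. This is a minimal sketch, assuming a device running the Android P beta is connected with USB debugging enabled and that the Pixel Launcher uses the package name `com.google.android.apps.nexuslauncher`:

```shell
# Disable the Pixel Launcher for the main user (package name assumed).
adb shell pm disable-user --user 0 com.google.android.apps.nexuslauncher

# Tap the Recents button now: the classic vertical Nougat/Oreo view should appear.

# Re-enable the Pixel Launcher afterwards.
adb shell pm enable com.google.android.apps.nexuslauncher
```

Disabling rather than uninstalling keeps the launcher's data intact, so re-enabling it restores the new gesture navigation.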

It seems that the multitasking view has been separated from the system layer so that it can be updated through Google Play. By integrating the Recents view into its launcher, Google can improve multitasking without having to release system updates.

In our hands-on with the Android P beta we were able to try this new navigation system: a short swipe up shows the open applications, a longer swipe opens the app drawer, and swiping sideways along the bottom lets you jump from application to application.

Are the new gestures exclusive to Pixel and Android One?

Until Google releases the final version of Android P along with its source code, we will not know whether the new interface and gesture system are exclusive to Google's launcher. If that is confirmed, the gestures would be exclusive to Pixel and Android One devices.

The same would apply to the new App Actions, with which the launcher predicts the actions we are about to take, such as calling a contact, sending a message, playing a music album, or tracking a workout.

All these features would belong to Google's customization layer, so the rest of the manufacturers would have to create their own implementation of the gestures in their interfaces unless they bet on Android One.