Troubleshoot Android Performance Tuner issues and FAQs

If you're having trouble integrating, using or understanding Android Performance Tuner, the information below may help.

Common issues when getting started

Android Performance Tuner is not yet fully integrated.

If the Performance insights page shows an introductory message with the heading 'Get performance insights with Android Performance Tuner', your integration is incomplete.

Complete integration and then upload your game to Play Console, as described on the Android developers site.
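
If you're integrating the native (C/C++) version of the library, the core of the integration is a one-time initialisation call plus a per-frame tick. The sketch below is a minimal outline, assuming the TuningFork_Settings defaults and the TFTICK_RAW_FRAME_TIME instrument key from the AGDK headers; verify both against the SDK version that you ship.

  #include <jni.h>
  #include "tuningfork/tuningfork.h"  // Android Game SDK (AGDK) header

  // Minimal sketch: initialise Android Performance Tuner once at startup.
  // Zero-initialised settings fall back to the descriptor files that you
  // bundled into your app's assets during integration.
  bool InitPerformanceTuner(JNIEnv* env, jobject activity) {
      TuningFork_Settings settings{};
      return TuningFork_init(&settings, env, activity) == TUNINGFORK_ERROR_OK;
  }

  // Call once per rendered frame if you are not using the Frame Pacing API.
  void OnFrame() {
      TuningFork_frameTick(TFTICK_RAW_FRAME_TIME);  // key name: assumption,
                                                    // confirm in your SDK version
  }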

Android Performance Tuner is fully integrated, but your app is not yet released.

If the Performance insights page is prompting you to create a release, Android Performance Tuner is fully integrated, but you need to publish your app.

Release your app to a test track or publish it on Google Play. For more information on releasing, go to Prepare and roll out a release.

You're receiving error messages when uploading your app.

When you upload your app to Play, some final checks are run to validate your configuration. If you receive a warning message, please review the details, ensuring that you have completed the checklist for your relevant integration path.

Your app is released but has insufficient data.

The amount of data collected needs to reach a minimum threshold before it’s displayed in Play Console. However, if you release your Android App Bundle to an internal testing track, this minimum threshold does not apply. This means that you can validate your setup in-house before publishing your app on Google Play.

FAQs

How does Android Performance Tuner affect my frame rate distribution? Could it slow it down? The plug-in seems to have changed it.

Our testing indicates that Android Performance Tuner has a negligible (<1%) impact on frame time performance. If you are using the frame pacing API to provide the frame timings for Android Performance Tuner, then you will see that your frame times have become much more consistent. This is a good thing as it will reduce microstuttering. The number of slow frames should remain largely unchanged.
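
If you're wiring the two libraries together in native code, the AGDK samples do this by passing Swappy's tracer-injection function into the Tuning Fork settings before initialisation. A hedged sketch, assuming the swappy_tracer_fn and swappy_version settings fields and the SwappyGL_injectTracer function from the AGDK headers:

  #include <jni.h>
  #include "swappy/swappyGL.h"
  #include "swappy/swappyGL_extra.h"
  #include "tuningfork/tuningfork.h"

  // Sketch: initialise Frame Pacing first, then hand its tracer to Android
  // Performance Tuner so both libraries share one set of frame timings.
  TuningFork_ErrorCode InitTunerWithFramePacing(JNIEnv* env, jobject activity) {
      SwappyGL_init(env, activity);
      TuningFork_Settings settings{};
      if (SwappyGL_isEnabled()) {
          settings.swappy_tracer_fn = &SwappyGL_injectTracer;
          settings.swappy_version = Swappy_version();
      }
      return TuningFork_init(&settings, env, activity);
  }

With this hookup in place you should not need to call the frame tick yourself; Swappy can report paced frame times on your behalf.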

Can I use this before I launch?

Yes, if you release your app on an internal test track (which supports up to 100 users), then your performance data will be available in Android vitals. Note that when your app is on an internal test track, we show all data regardless of session count. However, when you promote your app to closed testing, open testing or production, we only show data once you have reached a statistically significant session count. This means that when you promote your app from test to production, there could be a short window of time where you do not see any data, before your production app reaches the necessary level of adoption.

Do I lose any data if I change my target frame rate?

Slow and fast frame metrics are tied to target frame rate; if this rate changes, the issues and opportunities change too. However, the underlying frame time data itself does not change.
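
For example, if you use the Frame Pacing API, the target frame rate is the swap interval that you set; lowering it only moves the threshold that classifies recorded frames as slow or fast. A sketch, assuming the SwappyGL_setSwapIntervalNS function and SWAPPY_SWAP_30FPS constant from the AGDK headers:

  #include "swappy/swappyGL.h"

  // Sketch: drop the target from 60 FPS to 30 FPS. Frames that were already
  // recorded keep their raw durations; only the slow/fast bucketing changes.
  void LowerTargetFrameRate() {
      SwappyGL_setSwapIntervalNS(SWAPPY_SWAP_30FPS);  // ~33.3 ms per frame
  }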

Can I turn off data collection?

No, turning data collection off is not an option in Play Console.

I determine my users’ quality settings at runtime. How does this affect the data that I see?

We rely on you to report your quality settings accurately at runtime, and for them to match the quality levels that you defined during your integration. Otherwise, sessions are classified under the 'unknown' quality level.
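
In the native API, you report the level that you chose by serialising your FidelityParams message (the one you defined in dev_tuningfork.proto during integration) and passing it to TuningFork_setFidelityParameters. A hedged sketch; the MyGame::FidelityParams type is a stand-in for whatever your proto package actually generates:

  #include <cstdint>
  #include <vector>
  #include "tuningfork/tuningfork.h"
  #include "dev_tuningfork.pb.h"  // your generated FidelityParams message

  // Sketch: after deciding quality at runtime, report it so these sessions
  // are not bucketed as 'unknown'. The values must match a quality level
  // that you registered during integration.
  void ReportChosenQuality(const MyGame::FidelityParams& params) {
      std::vector<uint8_t> buf(params.ByteSizeLong());
      params.SerializeToArray(buf.data(), static_cast<int>(buf.size()));

      TuningFork_CProtobufSerialization ser{};
      ser.bytes = buf.data();
      ser.size = buf.size();
      ser.dealloc = nullptr;  // we own the buffer; the library must not free it
      TuningFork_setFidelityParameters(&ser);
  }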

I allow my users to override their quality settings. How does this affect the data that I see?

At the moment we don’t track user-driven changes, but we plan to do so in the future. Until then, these changes can appear in two different ways, depending on whether the settings that users select match the quality levels that you pre-configured. If they do, the sessions are reported under the appropriate quality level; otherwise, they appear under 'unknown'. One possible consequence is that some device models may be reported on more than one quality level.

What happens if a device model is running on more than one quality level?

A device model can appear multiple times in the chart and tables if its sessions are reported on more than one quality level.

Why would a device model be running on more than one quality level?

This could happen if:

  • the user changed the quality level,
  • the quality level was set dynamically at a more granular level than device model, or
  • the quality level was changed remotely without a new release.
