How best to detect device performance capabilities and set quality setting accordingly when using Unity

unknownSPY
I am developing a mobile game for Android and iOS in Unity.

I'm currently in the optimisation stage of development and trying to work out the best way to detect device capabilities, with the aim of then setting an appropriate quality level.

OS/platform detection is simple, but how can I distinguish a high-performance Android device from a low- or medium-performance one at runtime?
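For context, the naive heuristic I can picture would bucket devices by static hardware info from Unity's `SystemInfo` and pick a quality level from that. This is only a sketch; the thresholds below are guesses, not tested values:

```csharp
using UnityEngine;

// Rough heuristic: bucket the device into a quality tier from static
// hardware info. The RAM/core/VRAM thresholds are guesses, not tuned values.
public static class DeviceTier
{
    public static int GuessQualityLevel()
    {
        int ramMb   = SystemInfo.systemMemorySize;    // total system RAM in MB
        int cores   = SystemInfo.processorCount;      // logical CPU cores
        int vramMb  = SystemInfo.graphicsMemorySize;  // graphics memory in MB

        if (ramMb >= 4096 && cores >= 8 && vramMb >= 2048)
            return 2; // e.g. "High" in the project's Quality Settings list
        if (ramMb >= 2048 && cores >= 4)
            return 1; // "Medium"
        return 0;     // "Low"
    }

    public static void Apply()
    {
        // applyExpensiveChanges = true so texture limits etc. take effect
        QualitySettings.SetQualityLevel(GuessQualityLevel(), true);
    }
}
```

But raw specs map poorly onto real-world performance (drivers, thermals, resolution all vary), which is why I'm asking whether there's a better approach.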

I did find a similar question, https://stackoverflow.com/q/21565468/4148676, but it is old and the answers aren't great; I'm hoping a better way of solving the problem has emerged in the last four years.

The only other solution I have is to monitor the average FPS and adjust the quality level based on the frame rate.
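Roughly, I imagine that idea working like the sketch below: sample the average frame rate over a window and step the quality level down (or back up) when it drifts outside a target band. The window size and FPS thresholds are arbitrary placeholders:

```csharp
using UnityEngine;

// Sketch of the FPS-monitoring idea: average the frame rate over a fixed
// window, then step QualitySettings down or up when it leaves a target band.
public class AdaptiveQuality : MonoBehaviour
{
    const float SampleWindow = 5f;  // seconds per measurement window
    const float LowFps  = 25f;      // drop a quality level below this
    const float HighFps = 55f;      // raise a quality level above this

    float elapsed;
    int frames;

    void Update()
    {
        elapsed += Time.unscaledDeltaTime;
        frames++;
        if (elapsed < SampleWindow) return;

        float avgFps = frames / elapsed;
        int level = QualitySettings.GetQualityLevel();

        if (avgFps < LowFps && level > 0)
            QualitySettings.SetQualityLevel(level - 1, true);
        else if (avgFps > HighFps && level < QualitySettings.names.Length - 1)
            QualitySettings.SetQualityLevel(level + 1, true);

        elapsed = 0f; // reset the measurement window
        frames = 0;
    }
}
```

My worry with this is that it only reacts after the player has already experienced poor performance, rather than starting at a sensible level.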

Any advice would be appreciated.
