I opened an issue about this in unicode-math's GitHub repository. However, I am not sure whether it is actually a bug or whether I just did something wrong.
Here is a minimal example:
    \documentclass{article}
    \usepackage{unicode-math}
    \setmathfont[version=Asana]      {Asana Math}
    \setmathfont[version=Cambria]    {Cambria Math}
    \setmathfont[version=LatinModern]{Latin Modern Math}
    \setmathfont[version=Minion]     {Minion Math}
    \setmathfont[version=XITS]       {XITS Math}
    \def\testprime{f'x'f''''x''''\quad f\prime x\prime f\qprime x\qprime}
    \setlength{\parindent}{0pt}
    \begin{document}
    \fontsize{36}{36}\selectfont
    Asana       \mathversion{Asana}       \[ \testprime \]
    Cambria     \mathversion{Cambria}     \[ \testprime \]
    LatinModern \mathversion{LatinModern} \[ \testprime \]
    Minion      \mathversion{Minion}      \[ \testprime \]
    XITS        \mathversion{XITS}        \[ \testprime \]
    \end{document}
It seems that the ASCII input is transformed into superscripts while \prime etc. are not.
With Asana Math the ASCII version looks fine while the \prime version looks horrible. It looks like the primes in Asana Math are designed as normal-size glyphs, so they need to be raised into superscript position to work well.
With Latin Modern Math, both cases look horrible. I think this is a font problem in the LM case.
For the other three fonts, \prime looks good while the ASCII input looks terrible. In these fonts the primes seem to be designed as superscript-size glyphs, and raising them even further does not look good.
Is there any way to modify the behavior of the ASCII input?
In addition, with Latin Modern Math neither '''' nor \qprime works; both produce nothing. I thought '''' was supposed to use negative kerns to fake \qprime when the glyph is not available.
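For what it's worth, faking a quadruple prime from single primes can be sketched like this (\fakeqprime is a name I made up, and the \mkern amounts are guesses to be tuned per font):

```latex
% Hypothetical fallback: build a quadruple prime from four \prime
% glyphs, pulling them together with negative kerns.
% The -2mu values are assumptions; adjust for your math font.
\newcommand{\fakeqprime}{\prime\mkern-2mu\prime\mkern-2mu\prime\mkern-2mu\prime}

% usage: $f^{\fakeqprime}$ when U+2057 is missing from the font
```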
Update: Thanks to both @KhaledHosny and @LeoLiu for their answers. The situation now seems more complicated. The old TeX way of typesetting primes gets the position right when both primes and subscripts are present. However, to conform to the Unicode standard, I think fonts should design the primes as superscript glyphs, which means they will look really bad when raised again into superscript position. For now, my temporary solution is to use \prime etc. and add a negative kern between the primes and the subscripts.
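For concreteness, that temporary workaround looks roughly like this (\primesub is a hypothetical helper, and the -4mu kern is just what happens to look acceptable for me; tune it per font):

```latex
% Hypothetical helper: set the prime as a superscript, then pull
% the subscript back under it with a negative kern on an empty atom.
% The \mkern amount is an assumption -- adjust for your math font.
\newcommand{\primesub}[2]{#1^{\prime}\mkern-4mu{}_{#2}}

% usage: $\primesub{f}{n}$ instead of $f'_n$
```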
Update: Thanks to @LeoLiu's answer, which worked fine for me before. Here is just an update for newer versions of unicode-math. The main change is that the prefix of unicode-math's internal macros is now __um instead of um.
    \ExplSyntaxOn
    \group_begin:
      \char_set_catcode_active:N \'
      \char_set_catcode_active:N \`
      \char_set_catcode_active:n {"2032}
      \char_set_catcode_active:n {"2033}
      \char_set_catcode_active:n {"2034}
      \char_set_catcode_active:n {"2057}
      \char_set_catcode_active:n {"2035}
      \char_set_catcode_active:n {"2036}
      \char_set_catcode_active:n {"2037}
      \cs_gset:Nn \__um_define_prime_chars:
        {
          \cs_set_eq:NN '        \__um_scan_prime:
          \cs_set_eq:NN ^^^^2032 \__um_scan_prime:
          \cs_set_eq:NN ^^^^2033 \__um_scan_dprime:
          \cs_set_eq:NN ^^^^2034 \__um_scan_trprime:
          \cs_set_eq:NN ^^^^2057 \__um_scan_qprime:
          \cs_set_eq:NN `        \__um_scan_backprime:
          \cs_set_eq:NN ^^^^2035 \__um_scan_backprime:
          \cs_set_eq:NN ^^^^2036 \__um_scan_backdprime:
          \cs_set_eq:NN ^^^^2037 \__um_scan_backtrprime:
        }
    \group_end:
    \ExplSyntaxOff

Also, it seems that newer versions of Cambria Math together with the latest unicode-math no longer need this fix, though it is still needed for Minion Math.
