The CGSize returned by sizeWithFont:minFontSize:actualFontSize:forWidth:lineBreakMode: always contains the same height. Why is that, and is there a way around it?
I want to align a string vertically, and it must not be truncated unless it can't fit on a single line even at the minimum font size. So I try to use this method to get the line height, but it always returns 57px no matter what actualFontSize comes back as.
Any ideas?
Once you have the actualFontSize from sizeWithFont:minFontSize:actualFontSize:forWidth:lineBreakMode:, you can calculate the height required to render the string with:
CGFloat actualHeight = [font fontWithSize:actualFontSize].ascender - [font fontWithSize:actualFontSize].descender;
(I found this page useful in understanding the various size values relating to fonts.)
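For example, here is a minimal sketch of putting this together to vertically center a single line; the names text, bounds, and baseFont are illustrative, and it uses the pre-iOS 7 NSString drawing additions the question is about:
UIFont *baseFont = [UIFont systemFontOfSize:24.0];
CGFloat actualFontSize = 0.0;
CGFloat maxWidth = bounds.size.width;

// Measure the string; actualFontSize is filled in with the size actually used.
[text sizeWithFont:baseFont
       minFontSize:10.0
    actualFontSize:&actualFontSize
          forWidth:maxWidth
     lineBreakMode:UILineBreakModeTailTruncation];

// Height of one line at the size that will really be used.
UIFont *actualFont = [baseFont fontWithSize:actualFontSize];
CGFloat actualHeight = actualFont.ascender - actualFont.descender;

// Offset the drawing origin so the line sits vertically centered in bounds.
CGPoint origin = CGPointMake(bounds.origin.x,
                             bounds.origin.y + (bounds.size.height - actualHeight) / 2.0);
[text drawAtPoint:origin
         forWidth:maxWidth
         withFont:baseFont
      minFontSize:10.0
   actualFontSize:&actualFontSize
    lineBreakMode:UILineBreakModeTailTruncation
baselineAdjustment:UIBaselineAdjustmentAlignBaselines];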
I believe you are misunderstanding the actualFontSize parameter. It is an out parameter only, not an in parameter. It does not matter what value you provide for actualFontSize; if the pointer is non-NULL, it will be overwritten with the font size used in determining the returned size. The returned actual font size will be between minFontSize: and the size of the UIFont instance provided as the sizeWithFont: argument of the selector.
You also must be sure to send this message to the string you intend to render. You would always get the same value, for example, if you were measuring a dummy string shorter than one line and asking how much height it needs at the supplied width.
If you update your question with the actual code you use to call the method, people will be better able to help you.
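For reference, a hedged sketch of what a correct call looks like; the string, font sizes, and width here are illustrative:
CGFloat actualFontSize = 0.0;   // initial value is ignored; the call overwrites it
NSString *text = @"The string you will actually render";

CGSize size = [text sizeWithFont:[UIFont boldSystemFontOfSize:30.0]
                     minFontSize:12.0
                  actualFontSize:&actualFontSize
                        forWidth:200.0
                   lineBreakMode:UILineBreakModeTailTruncation];

// actualFontSize now holds a value between 12 and 30: whichever size was
// needed to fit the string into 200 points of width.
NSLog(@"rendered at %.1f pt, measured size %@", actualFontSize, NSStringFromCGSize(size));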