
I'm writing a routine in Swift that needs to convert an arbitrary integer to a UnicodeScalar or return an error. The constructor UnicodeScalar(_: Int) does the job for valid code points, but it crashes when passed an integer that is not a valid code point.

Is there a Swift (or Foundation) function I can call to check in advance that an integer i is a valid Unicode code point, so that UnicodeScalar(i) won't crash?

2 Answers


Update for Swift 3:

UnicodeScalar now has a failable initializer which checks whether the given number is a valid Unicode code point:

if let unicode = UnicodeScalar(0xD800) {
    print(unicode)
} else {
    print("invalid")   // 0xD800 is a surrogate, so this branch runs
}
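Since the question asks for a conversion that returns an error, the failable initializer wraps naturally in a throwing function. A minimal sketch, assuming a modern Swift toolchain; the error type and function name here are my own illustration, not part of the answer:

struct InvalidCodePointError: Error {
    let value: Int
}

// Converts an arbitrary integer to a UnicodeScalar, or throws if it is
// not a valid Unicode scalar value (negative, a surrogate, or > 0x10FFFF).
func unicodeScalar(from value: Int) throws -> UnicodeScalar {
    guard let u32 = UInt32(exactly: value),
          let scalar = UnicodeScalar(u32) else {
        throw InvalidCodePointError(value: value)
    }
    return scalar
}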

(Previous answer, for Swift 2:) You can use the built-in UTF32() codec to check whether a given integer is a valid Unicode scalar:

extension UnicodeScalar {
    init?(code: UInt32) {
        // A generator that yields the single code unit.
        var codegen = GeneratorOfOne(code) // As suggested by @rintaro
        var utf32 = UTF32()
        // decode() returns .Result only for a valid Unicode scalar.
        guard case let .Result(scalar) = utf32.decode(&codegen) else {
            return nil
        }
        self = scalar
    }
}

(Using ideas from https://stackoverflow.com/a/24757284/1187415 and https://stackoverflow.com/a/31285671/1187415.)
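For reference, the same codec-based approach can be translated to post-Swift 3 syntax, where GeneratorOfOne and the .Result case were renamed. This is my own sketch of that translation, not part of the original answer:

extension UnicodeScalar {
    init?(code: UInt32) {
        // Wrap the single code unit in an iterator for the decoder.
        var iterator = CollectionOfOne(code).makeIterator()
        var utf32 = UTF32()
        // decode(_:) returns .scalarValue only for a well-formed code
        // point; surrogates and out-of-range values yield .error.
        guard case .scalarValue(let scalar) = utf32.decode(&iterator) else {
            return nil
        }
        self = scalar
    }
}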


1 Comment

GeneratorOfOne(code) is slightly better than [code].generate()

The Swift documentation states

A Unicode scalar is any Unicode code point in the range U+0000 to U+D7FF inclusive or U+E000 to U+10FFFF inclusive.

As of Swift 2.0b4, the UnicodeScalar constructor does not crash for any value in those ranges. I use this convenience constructor:

extension UnicodeScalar {
    init?(code: Int) {
        // Valid Unicode scalars: U+0000...U+D7FF and U+E000...U+10FFFF
        // (the range U+D800...U+DFFF is reserved for surrogates).
        guard (code >= 0 && code <= 0xD7FF) || (code >= 0xE000 && code <= 0x10FFFF) else {
            return nil
        }
        self.init(code)
    }
}
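A quick usage sketch (my own, assuming the extension above and the Swift 2-era syntax of this answer) showing the boundary behavior:

print(UnicodeScalar(code: 0x61) != nil)     // true: a valid scalar ("a")
print(UnicodeScalar(code: 0xD800) != nil)   // false: a surrogate code point
print(UnicodeScalar(code: 0x110000) != nil) // false: beyond U+10FFFF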

2 Comments

Good question and solution! For the sake of completeness, I have provided an alternative answer.
That feature is now "built-in" with Swift 3; see the other answer.
