How to Decode Human-Readable JSON Strings to Integer-Based Swift.OptionSet

I want to decode JSON that’s human readable but still represents a Swift.OptionSet. Traditionally, OptionSets are implemented with an integer-based rawValue, because that gives you the set algebra for free.

Here’s my type:

struct Selection: OptionSet {
    typealias RawValue = Int
    let rawValue: Int

    init(rawValue: Int) {
        self.rawValue = rawValue
    }

    static let selected = Selection(rawValue: 1 << 0)
    static let all = Selection(rawValue: 1 << 1)
}

Here, all represents the whole text of a document, for example, and selected stands for the currently selected text, if any.
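
That integer backing is what makes the set algebra free; it's all bitwise arithmetic under the hood. A quick sketch:

var selection: Selection = [.selected]
selection.insert(.all)      // union: rawValue 0b01 | 0b10
selection.contains(.all)    // true: membership is a bitwise AND
selection.rawValue          // 3, i.e. 0b11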

The JSON to configure the required input should be:

{
    "selection": ["all"]
}

To get Swift.Decodable for free, I would have to put the raw integer value into the JSON, because the default conformance for RawRepresentable types decodes exactly that. The JSON should be human-readable, though, so instead of writing 2, people should be able to write "all".
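
For comparison, here's roughly what the free route looks like. This is a sketch for a playground or script; the Config wrapper is made up for the demo, and the empty conformance would have to go again once you adopt one of the hand-written ones below:

import Foundation

// Since Selection is RawRepresentable with an Int raw value,
// an empty conformance declaration is all it takes:
extension Selection: Decodable {}

// Hypothetical wrapper, just to give the value a JSON key:
struct Config: Decodable {
    let selection: Selection
}

let data = #"{ "selection": 3 }"#.data(using: .utf8)!
let config = try JSONDecoder().decode(Config.self, from: data)
// config.selection == [.selected, .all], but nobody can read "3" in a config file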

Here’s a very direct, very naive approach that works just fine:

extension Selection: Decodable {
    init(from decoder: Decoder) throws {
        var container = try decoder.unkeyedContainer()
        var result: Selection = []
        while !container.isAtEnd {
            let optionName = try container.decode(String.self)
            switch optionName {
            case "selected":
                result.insert(.selected)
            case "all":
                result.insert(.all)
            case let unrecognized:
                let context = DecodingError.Context(
                  codingPath: decoder.codingPath,
                  debugDescription: "Selection not recognized: \(unrecognized)")
                throw DecodingError.dataCorrupted(context)
            }
        }
        self = result
    }
}
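
A quick smoke test, reusing the hypothetical Config wrapper from above:

let readable = #"{ "selection": ["all", "selected"] }"#.data(using: .utf8)!
let decoded = try JSONDecoder().decode(Config.self, from: readable)
assert(decoded.selection == [.selected, .all])

// Unrecognized names fail loudly instead of being dropped silently:
let bogus = #"{ "selection": ["everything"] }"#.data(using: .utf8)!
_ = try? JSONDecoder().decode(Config.self, from: bogus)
// DecodingError.dataCorrupted: "Selection not recognized: everything"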

The string-to-number mapping has to be encoded somewhere. Here, it's in the switch statement.

You can also provide a proper map data type instead:

extension Selection: Decodable {
    init(from decoder: Decoder) throws {
        var container = try decoder.unkeyedContainer()
        var result: Selection = []
        while !container.isAtEnd {
            let optionName = try container.decode(String.self)
            guard let selection = Selection.mapping[optionName] else {
                let context = DecodingError.Context(
                  codingPath: decoder.codingPath,
                  debugDescription: "Selection not recognized: \(optionName)")
                throw DecodingError.dataCorrupted(context)
            }
            result.insert(selection)
        }
        self = result
    }

    private static let mapping: [String: Selection] = [
        "selected": .selected,
        "all": .all
    ]
}

Which one to use?

It really does not matter. There's no difference in functionality; it's just a matter of style.

I prefer the mapping data type, because I don't have to touch the init(from:) initializer anymore, and I find the data type declaration a bit easier on the eye. There's no surrounding context at all: it's just a dictionary, and that's it.
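
To illustrate: adding a hypothetical third option, say paragraph, only touches declarations, and init(from:) stays exactly as it is:

extension Selection {
    // Made-up example option; 1 << 2 is the next free bit.
    static let paragraph = Selection(rawValue: 1 << 2)
}

The only other change is one new entry in the dictionary, "paragraph": .paragraph. With the switch-based version, I'd have to edit the initializer body instead.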

The case let unrecognized was nice. I like pattern matching. But the guard clause is super simple, too. There’s no real ground to argue for one or the other, as is often the case in programming. So pick whichever you like best, and happy coding.

Update 2022-09-27: There's been a discussion on the Swift forums about an OptionSet initializer from JSON strings that you should check out. It helped me figure out how to map an array of strings into a single OptionSet integer value.