I have an enum named ProgrammingLanguage:
enum ProgrammingLanguage {
    case Swift, Haskell, Scala
}
Now I have a class named Programmer with the following property:
let favouriteLanguages: ProgrammingLanguage = .Swift
Seeing how a programmer could have several favourite languages, I thought it'd be nice to write something like this:
let favouriteLanguages: ProgrammingLanguage = [.Swift, .Haskell]
After a bit of research, I realized that I need to conform to OptionSetType, but in doing so, I raised the following 3 errors:

- ProgrammingLanguage does not conform to SetAlgebraType
- ProgrammingLanguage does not conform to OptionSetType
- ProgrammingLanguage does not conform to RawRepresentable
When I saw the RawRepresentable error, I immediately thought of raw value types for enums. I wanted to be able to print the enum value anyway, so I changed my enum signature to the following:
enum ProgrammingLanguage: String, OptionSetType {
    case Swift, Haskell, Scala
}
This silenced 2 of the errors, but I'm still left with the one saying I don't conform to protocol SetAlgebraType.
After a bit of trial and error, I found out that having the raw value type of the enum as Int fixed it (which makes sense, since the RawRepresentable protocol requires you to implement an initializer with the signature init(rawValue: Int)). However, I'm unsatisfied with that; I want to be able to get the String representation of the enum easily.
Could someone advise me how I can do this easily, and why OptionSetType requires an Int raw value type?
Edit: The following declaration compiles correctly, but errors at runtime:
enum ProgrammingLanguage: Int, OptionSetType {
    case Swift, Scala, Haskell
}

extension ProgrammingLanguage {
    init(rawValue: Int) {
        self.init(rawValue: rawValue)
    }
}
let programmingLanguages: ProgrammingLanguage = [.Swift, .Scala]
Edit: I'm surprised at my former self for not saying this upfront at the time, but... instead of trying to force other value types into the OptionSet protocol (Swift 3 removed Type from the name), it's probably better to consider the API where you use those types and use Set collections where appropriate.
OptionSet types are weird. They are both collections and not collections: you can construct one from multiple flags, but the result is still a single value. (You can do some work to figure out a collection-of-single-flags equivalent to such a value, but depending on the possible values in the type it might not be unique.)
On the other hand, being able to have one something, or more than one unique somethings, can be important to the design of an API. Do you want users to say they have more than one favorite, or enforce that there's only one? Just how many "favorites" do you want to allow? If a user claims multiple favorites, should they be ranked in user-specific order? These are all questions that are hard to answer in an OptionSet-style type, but much easier if you use a Set type or other actual collection.
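For instance, here's a minimal sketch of the Set-based modeling (reusing the question's enum; a simple enum without associated values is Hashable for free, which Set requires):

enum ProgrammingLanguage {
    case Swift, Haskell, Scala
}

class Programmer {
    // A true collection of distinct favourites, with membership
    // checks and a count for free:
    var favouriteLanguages: Set<ProgrammingLanguage> = []
}

let programmer = Programmer()
programmer.favouriteLanguages = [.Swift, .Haskell]
programmer.favouriteLanguages.contains(.Swift)   // true
programmer.favouriteLanguages.count              // 2

And if favourites should be ranked, swapping the Set for an Array (with uniqueness enforced at the API boundary) preserves order, which a Set doesn't.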
The rest of this answer a) is old, using Swift 2 names, and b) assumes that you're trying to implement OptionSet anyway, even if it's a bad choice for your API...
See the docs for OptionSetType:

"Supplies convenient conformance to SetAlgebraType for any type whose RawValue is a BitwiseOperationsType."
In other words, you can declare OptionSetType conformance for any type that also adopts RawRepresentable. However, you gain the magic set-algebra syntax support (via operators and ArrayLiteralConvertible conformance) if and only if your associated raw value type is one that conforms to BitwiseOperationsType.
So, if your raw value type is String, you're out of luck: you don't gain the set algebra stuff, because String doesn't support bitwise operations. (The "fun" thing here, if you can call it that, is that you could extend String to support BitwiseOperationsType, and if your implementation satisfies the axioms, you could use strings as raw values for an option set.)
Your second snippet errors at runtime because you've created an infinite recursion: calling self.init(rawValue:) from init(rawValue:) keeps going until it blows the stack.
It's arguably a bug (please file it) that you can even try that without a compile-time error. Enums shouldn't be able to declare OptionSetType conformance, because:
- The semantic contract of an enum is that it's a closed set. By declaring your ProgrammingLanguage enum, you're saying that a value of type ProgrammingLanguage must be one of Swift, Scala, or Haskell, and not anything else. A value of "Swift and Scala" isn't in that set.
- The underlying implementation of an OptionSetType is based on integer bitfields. A "Swift and Haskell" value ([.Swift, .Haskell]) is really just .Swift.rawValue | .Haskell.rawValue. This causes trouble if your set of raw values isn't bit-aligned. That is, if .Swift.rawValue == 1 == 0b01 and .Haskell.rawValue == 2 == 0b10, the bitwise-or of those is 0b11 == 3, which is the same as .Scala.rawValue.
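To make that collision concrete, here's the arithmetic from the hypothetical above as a tiny sketch (the raw values are the assumed ones from that bullet, not the ones Swift would auto-assign to the question's enum):

// Hypothetical raw values from the bullet above:
let swiftBits: Int   = 0b01  // .Swift.rawValue == 1
let haskellBits: Int = 0b10  // .Haskell.rawValue == 2
let scalaBits: Int   = 0b11  // .Scala.rawValue == 3

// Bitwise-or of Swift and Haskell is indistinguishable from Scala:
let combined = swiftBits | haskellBits   // 0b11 == 3
let collides = combined == scalaBits     // true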
Instead, to get OptionSetType conformance, declare a struct. Use static let to declare the members of your type. And pick your raw values such that members you want to be distinct from possible (bitwise-or) combinations of other members actually are.
struct ProgrammingLanguage: OptionSetType {
    let rawValue: Int

    // this initializer is required, but it's also automatically
    // synthesized if `rawValue` is the only member, so writing it
    // here is optional:
    init(rawValue: Int) { self.rawValue = rawValue }

    static let Swift   = ProgrammingLanguage(rawValue: 0b001)
    static let Haskell = ProgrammingLanguage(rawValue: 0b010)
    static let Scala   = ProgrammingLanguage(rawValue: 0b100)
}
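With that struct, the set-literal syntax from the question compiles and behaves like a set; a quick usage sketch:

let favourites: ProgrammingLanguage = [.Swift, .Haskell]
favourites.contains(.Swift)   // true
favourites.contains(.Scala)   // false
favourites.rawValue           // 0b011 == 3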
Good ways to keep your values distinct: use binary-literal syntax as above, or declare your values with bit shifts of one, as below:
static let Swift = ProgrammingLanguage(rawValue: 1 << 0)
static let Haskell = ProgrammingLanguage(rawValue: 1 << 1)
static let Scala = ProgrammingLanguage(rawValue: 1 << 2)
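Finally, on the question's wish for an easy String representation: a struct-based option set has no String raw value to lean on, so one option is a hand-rolled CustomStringConvertible conformance. A minimal sketch under that assumption, in Swift 2 names (the name table is hypothetical and has to be kept in sync with the members by hand):

extension ProgrammingLanguage: CustomStringConvertible {
    var description: String {
        // Map each member to a display name by hand; nothing
        // synthesizes this for an option set.
        let names: [(ProgrammingLanguage, String)] = [
            (.Swift, "Swift"), (.Haskell, "Haskell"), (.Scala, "Scala")
        ]
        let present = names.filter { self.contains($0.0) }.map { $0.1 }
        return present.joinWithSeparator(", ")
    }
}

print([.Swift, .Scala] as ProgrammingLanguage)   // Swift, Scala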