What time complexity (in big-O notation) is provided by the ES6 specification for the Keyed Collections (Set, Map, WeakSet, and WeakMap)?
My expectation, and I expect that of most developers, is that the specification and implementations would use widely accepted, performant algorithms, in which case Set.prototype.has, add and delete would all be O(1) in the average case. The same goes for the Map and Weak equivalents.
It is not entirely apparent to me whether the time complexity of the implementations is mandated anywhere, e.g. in the ECMAScript 2015 Language Specification - 6th Edition, 23.2 Set Objects.
Unless I misunderstand it (and it's certainly very possible I do), it looks like the ECMA spec mandates that the implementations (e.g. Set.prototype.has) use a linear-time (O(n)) algorithm. It would strike me as exceedingly surprising if more performant algorithms were not mandated, or even permitted, by the spec, and I would be very interested in an explanation for why this is the case.
Right from the very paragraph you linked to:
Set objects must be implemented using [mechanisms] that, on average, provide access times that are sublinear on the number of elements in the collection.
You will find the same sentence for Maps, WeakMaps and WeakSets.
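As a rough, unofficial sanity check (the sizes and timings below are purely illustrative and not part of the spec), you can observe that Set.prototype.has does not slow down linearly as the collection grows:

```js
// Illustrative benchmark: if Set.prototype.has were O(n), the time per lookup
// would grow roughly in proportion to the set size. In current engines it
// stays roughly flat, which is consistent with hash-table-style lookups.
for (const size of [1e4, 1e5, 1e6]) {
  const set = new Set();
  for (let i = 0; i < size; i++) set.add(i);

  const lookups = 1e6;
  const start = performance.now();
  let hits = 0;
  for (let i = 0; i < lookups; i++) {
    if (set.has(i % size)) hits++;
  }
  const ms = performance.now() - start;
  console.log(`size=${size}: ${lookups} lookups in ${ms.toFixed(1)} ms (hits=${hits})`);
}
```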
It looks like the ECMA spec mandates that the implementations (e.g. Set.prototype.has) use a linear-time (O(n)) algorithm.
No:
The data structures used in this Set objects specification is only intended to describe the required observable semantics of Set objects. It is not intended to be a viable implementation model.
The observable semantics are mostly related to the predictable iteration order (which can still be implemented efficiently). The specification does indeed expect an implementation to use a hash table or something similar with constant access times, though trees (with logarithmic access complexity) are allowed as well.
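For example, insertion-order iteration is one of the observable semantics the spec does pin down, while the backing data structure is not. The snippet below only demonstrates the guaranteed ordering; it says nothing about how the engine stores the entries:

```js
// Iteration order is an observable semantic mandated by the spec:
// entries come back in insertion order, regardless of the backing structure.
const set = new Set();
set.add('b');
set.add('a');
set.add('c');
set.delete('a');
set.add('a'); // re-inserting after deletion places 'a' at the end

console.log([...set]); // ['b', 'c', 'a']

const map = new Map([[2, 'two'], [1, 'one']]);
console.log([...map.keys()]); // [2, 1], i.e. insertion order, not sorted key order
```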