Bug #16660 (Closed)
Struct#deconstruct_keys inconsistent behavior
Description
Here is an example of Struct#deconstruct_keys behaviour that is somewhat surprising (at least to me):
klass = Struct.new(:a, :b)
s = klass.new(1, 2)
- When some keys are not recognized and the total number of keys does not exceed the number of struct members:
s.deconstruct_keys([:a, :c])
#=> {a: 1}
- When some keys are not recognized but the total number of keys exceeds the number of struct members:
s.deconstruct_keys([:a, :b, :c])
#=> {}
It's not clear why the first call filters out the unknown key while the second returns an empty Hash instead of {a: 1}.
This behaviour was introduced in https://github.com/ruby/ruby/commit/2439948bcc0ec9daf91cf79301195e59bad49aff. Prior to that change an empty hash was returned in both cases.
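For context, deconstruct_keys is the hook that case/in evaluates behind the scenes: the engine passes the pattern's key list, and a returned hash that lacks a required key means the clause does not match. Below is a minimal sketch that makes the call visible, using a hypothetical Probe class that is not part of the original report:

class Probe
  def deconstruct_keys(keys)
    puts "deconstruct_keys(#{keys.inspect})"  # show what the engine passes in
    {a: 1}
  end
end

case Probe.new
in {a:, c:} then puts "matched"
else puts "no match"
end
# Prints "deconstruct_keys([:a, :c])" followed by "no match",
# because the returned hash has no :c key.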
Updated by palkan (Vladimir Dementyev) almost 5 years ago
(For some reason, I cannot edit the original post.)
> Prior to that change an empty hash was returned in both cases.
I think the less confusing behavior would be to ignore the unknown keys without "failing fast" (i.e., without returning an empty Hash when a key is unknown).
That is:
s.deconstruct_keys([:a, :c])
#=> {a: 1}
s.deconstruct_keys([:a, :b, :c])
#=> {a: 1, b: 2}
IMO, that better reflects pattern matching semantics: if a key is recognized in one pattern (one key set), it should also be recognized in another pattern that contains the same key.
I guess this was done as an optimization (fail fast if there are more keys than we have), but I believe that such an optimization should be done within the pattern matching implementation, not in a particular class's API. The proposed behaviour would be even easier to optimize (e.g., we could re-use the deconstruction hash for key subsets or cache the result of the key presence check).
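Here is a sketch of the proposed semantics in plain Ruby, purely for illustration (the real method is implemented in C, and monkeypatching Struct like this is not part of the proposal):

class Struct
  def deconstruct_keys(keys)
    return to_h if keys.nil?  # nil means the pattern wants all attributes
    keys.each_with_object({}) do |key, hash|
      # Silently skip unknown keys instead of failing fast.
      hash[key] = self[key] if members.include?(key)
    end
  end
end

s.deconstruct_keys([:a, :c])     #=> {a: 1}
s.deconstruct_keys([:a, :b, :c]) #=> {a: 1, b: 2}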
Updated by mame (Yusuke Endoh) over 4 years ago
- Assignee set to ktsj (Kazuki Tsujimoto)
Updated by matz (Yukihiro Matsumoto) over 4 years ago
I agree that it's inconsistent and (a little bit) confusing. But it's not a bug.
I would like @ktsj (Kazuki Tsujimoto) to revisit this issue, but I will honor his decision.
Matz.
Updated by ktsj (Kazuki Tsujimoto) over 4 years ago
- Status changed from Open to Rejected
My stance is: "if we really should care about consistency over performance, we should remove the keys argument itself from the specification. Otherwise, we should do a thorough optimization using the keys argument (*)".
Since deconstruct_keys is supposed to be used implicitly in the context of pattern matching rather than explicitly by the user, I don't think this inconsistency will actually cause confusion for the user.
(*) This means that the return value of deconstruct_keys is implementation-dependent.
> I believe that such an optimization should be done within the pattern matching implementation, not in a particular class's API.
Let's compare Struct#deconstruct_keys with Hash#deconstruct_keys.
# Struct#deconstruct_keys (palkan's proposed version)
s.deconstruct_keys([:a, :c])
#=> {a: 1}

# Hash#deconstruct_keys (2.7's version)
h = {a: 1, b: 2}
h.deconstruct_keys([:a, :c])
#=> {a: 1, b: 2}
Should h.deconstruct_keys([:a, :c]) also return {a: 1}?
I don't think so.
The optimal return value for each class is different.
Therefore, we should implement an optimized #deconstruct_keys in each class.
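As an illustration of that point, a class whose attributes are expensive to compute could use the keys argument both to fail fast and to compute only what the pattern asks for. The Measurement class below is hypothetical and only sketches the idea:

class Measurement
  ATTRS = %i[mean median].freeze

  def initialize(samples)
    @samples = samples
  end

  def deconstruct_keys(keys)
    return {mean: mean, median: median} if keys.nil?
    return {} if keys.size > ATTRS.size  # can never match: skip all work
    keys.each_with_object({}) do |key, hash|
      hash[key] = send(key) if ATTRS.include?(key)  # compute only requested attributes
    end
  end

  private

  def mean
    @samples.sum.fdiv(@samples.size)
  end

  def median
    @samples.sort[@samples.size / 2]
  end
end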
> we could re-use the deconstruction hash for key subsets or cache the result of the key presence check
The return value can still be cached in the current specification.
For example:
case val
in {a:, b:, c:}
in {a:, b:}
end
We can compile this code into the following pseudo code:
case val
in {a:, b:} && {c:}
in {a:, b:}
end
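To see why this compilation helps, note that the engine currently calls deconstruct_keys once per hash-pattern clause. A hypothetical Logged class (illustrative only) makes the calls visible:

class Logged
  def deconstruct_keys(keys)
    puts "deconstruct_keys(#{keys.inspect})"
    {a: 1, b: 2}
  end
end

case Logged.new
in {a:, b:, c:} then :first
in {a:, b:} then :second
end
# Prints:
#   deconstruct_keys([:a, :b, :c])
#   deconstruct_keys([:a, :b])

Under the pseudo code above, the shared {a:, b:} deconstruction could be performed once and re-used, with only the extra {c:} check differing between the clauses.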