Feature #8998 (Closed)
string keys for hash literals should use fstrings
Description
Since we are introducing frozen-string optimizations, string keys
inside hash literals should be frozen at the compiler level
to prevent duplication.
a = { "ABC" => :t }
b = { "ABC" => :t }
# the following ought to print true
p(a.keys[0].object_id == b.keys[0].object_id)
This should introduce no incompatibilities and be transparent to users of
older rubies.
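For context, on MRI releases that include this optimization (2.1+, after r43870 below), the requested behavior can be checked directly. This snippet is illustrative, not part of any patch in this thread:

```ruby
# Literal string keys in hash literals are frozen and de-duplicated
# at insertion time, so two hashes built from the same literal
# share one key object.
a = { "ABC" => :t }
b = { "ABC" => :t }

p a.keys[0].frozen?           # => true
p a.keys[0].equal?(b.keys[0]) # => true (same de-duplicated object)

# A caller's own string is never mutated; a frozen copy is stored.
k = "mutable"
h = { k => 1 }
p k.frozen?           # => false
p h.keys[0].equal?(k) # => false
```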
Updated by normalperson (Eric Wong) about 11 years ago
- File hash_aset_fstring.diff hash_aset_fstring.diff added
Proposed patch to partially implement this, but I get segfaults (backtrace/dump coming) with "make check".
There probably needs to be some RGenGC-related calls/fixes for this.
Note: this patch does not avoid the short-lived, unfrozen string literal.
I don't currently know the parser/compiler code well enough to make
that optimization.
Updated by normalperson (Eric Wong) about 11 years ago
- File hash_aset_fstring_check_fail.txt hash_aset_fstring_check_fail.txt added
- File hash_aset_fstring_check_bt.txt hash_aset_fstring_check_bt.txt added
attaching output of "make check" and gdb backtrace
Updated by normalperson (Eric Wong) about 11 years ago
I think my failed patch exposes a bug with lazy sweep + rb_fstring.
With lazy-sweep GC, an entry can linger in the frozen_strings table after it becomes garbage:

1. fstr1 = rb_fstring(str)
2. fstr1 goes out of scope
3. GC mark runs ...
4. fstr1 is eligible for lazy sweep
5. fstr2 = rb_fstring(str) # table hit: fstr2 is the same object as fstr1
6. fstr1 is swept
7. fstr2 use attempted -> crash
Updated by normalperson (Eric Wong) about 11 years ago
The issue I had with my original patch was fixed by nobu in r43210.
Thanks!
So my proposed patch should be safe to apply, but it's only a partial
implementation of the idea for this feature.
I don't think I'll have time+energy to implement this for the
parser/compiler.
Updated by normalperson (Eric Wong) about 11 years ago
Eric Wong normalperson@yhbt.net wrote:
So my proposed patch should be safe to apply, but it's only a partial
implementation of the idea for this feature.
While that's true, I haven't been able to measure performance
improvements from this (no real apps which are very hash-dependent).
Can anybody else?
Updated by tmm1 (Aman Karmani) about 11 years ago
I didn't realize MRI already froze string keys in hashes.
Your patch reduces long-lived strings in our rails app by ~11%:
# before the patch
$ ruby -rconfig/environment -e' GC.start; p ObjectSpace.count_objects[:T_STRING] '
173956
# with the patch applied
$ ruby -rconfig/environment -e' GC.start; p ObjectSpace.count_objects[:T_STRING] '
154750
I am inclined to merge it.
Updated by tmm1 (Aman Karmani) about 11 years ago
    if (!OBJ_FROZEN(str))
        *key = rb_fstring(str);
Do you know why the OBJ_FROZEN check is required here?
I tried to investigate and it appears this function is sometimes invoked with symbols.
Updated by tmm1 (Aman Karmani) about 11 years ago
- Status changed from Open to Closed
- % Done changed from 0 to 100
This issue was solved with changeset r43870.
Eric, thank you for reporting this issue.
Your contribution to Ruby is greatly appreciated.
May Ruby be with you.
- hash.c (hash_aset_str): Use rb_fstring() to de-duplicate hash string keys. Patch by Eric Wong. [Bug #8998] [ruby-core:57727]
- test/ruby/test_hash.rb (class TestHash): test for above.
Updated by normalperson (Eric Wong) about 11 years ago
"tmm1 (Aman Gupta)" ruby@tmm1.net wrote:
Issue #8998 has been updated by tmm1 (Aman Gupta).
Cool, I didn't check object lifetimes.
Any measurable impact on speed?
    if (!OBJ_FROZEN(str))
        *key = rb_fstring(str);
Do you know why the OBJ_FROZEN check is required here?
Some apps use frozen constant strings to speed up hash assignment for
common string keys. No point in having the extra hash lookup if it was
already a pre-frozen string (and it could break existing behavior).
Ideally it would be nice to transparently freeze strings used for hash
assignment in the parser so apps wouldn't have to add the ugly frozen
constants. Mongrel, and most probably every server descended from
Mongrel uses pre-frozen constants to speed up hash assignment.
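The idiom in question can be sketched like this (the constant name is illustrative, not Mongrel's actual code):

```ruby
# Pre-frozen constant keys: the frozen string is stored as-is on
# every request, avoiding a per-assignment dup + freeze.
HTTP_HOST = "HTTP_HOST".freeze # hypothetical constant for illustration

env = {}
env[HTTP_HOST] = "example.com"

p env.keys[0].equal?(HTTP_HOST) # => true (no copy was made)

# An unfrozen key, by contrast, gets a frozen duplicate stored:
key = "REQUEST_PATH"
env[key] = "/"
p env.keys[1].equal?(key)       # => false
```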
I tried to investigate and it appears this function is sometimes
invoked with symbols.
Really? Yikes! That's strange. I didn't expect that, and I'm
not sure how it could be from reading rb_hash_aset()...
Updated by tmm1 (Aman Karmani) about 11 years ago
Seeing some occasional segfaults in CI after this change. Not quite sure what's up yet.
http://c5632.rubyci.org/~chkbuild/ruby-trunk/log/20131127T070302Z.log.html.gz
#3 0x0063fb97 in rb_bug (fmt=0x67363d "Segmentation fault") at error.c:341
#4 0x00557a16 in sigsegv (sig=11, info=0x95f9c7c, ctx=0x95f9cfc)
at signal.c:699
#5
#6 search_nonascii (p=0x1c <Address 0x1c out of bounds>,
len=, enc=0x95df180) at string.c:202
#7 coderange_scan (p=0x1c <Address 0x1c out of bounds>,
len=, enc=0x95df180) at string.c:232
#8 0x0057ab6e in rb_enc_str_coderange (str1=184779220, str2=173130540)
at string.c:376
#9 rb_str_comparable (str1=184779220, str2=173130540) at string.c:2403
#10 0x0057bb32 in rb_str_hash_cmp (a=184779220, b=173130540) at string.c:2367
#11 fstring_cmp (a=184779220, b=173130540) at string.c:161
#12 0x00561257 in find_entry (table=0x9603aa0, key=184779220,
value=0xbf9a818c) at st.c:387
#13 st_lookup (table=0x9603aa0, key=184779220, value=0xbf9a818c) at st.c:425
#14 0x00568062 in rb_fstring (str=184779220) at string.c:144
#15 0x004b00da in hash_aset_str (key=0xbf9a8234, val=0xbf9a8218,
arg=3214574164, existing=0) at hash.c:1267
#16 hash_aset_str_insert (key=0xbf9a8234, val=0xbf9a8218, arg=3214574164,
existing=0) at hash.c:1273
#17 0x00561dba in st_update (table=0xa069f50, key=184779220,
func=0x4b0040 <hash_aset_str_insert>, arg=3214574164) at st.c:908
#18 0x004b4c0f in tbl_update (hash=184779600, key=184779220, val=171144140)
at hash.c:380
#19 rb_hash_aset (hash=184779600, key=184779220, val=171144140) at hash.c:1316
Updated by normalperson (Eric Wong) about 11 years ago
"tmm1 (Aman Gupta)" ruby@tmm1.net wrote:
Seeing some occasional segfaults in CI after this change. Not quite
sure what's up yet.
http://c5632.rubyci.org/~chkbuild/ruby-trunk/log/20131127T070302Z.log.html.gz
#3 0x0063fb97 in rb_bug (fmt=0x67363d "Segmentation fault") at error.c:341
#4 0x00557a16 in sigsegv (sig=11, info=0x95f9c7c, ctx=0x95f9cfc)
at signal.c:699
#5
#6 search_nonascii (p=0x1c <Address 0x1c out of bounds>,
len=, enc=0x95df180) at string.c:202
#7 coderange_scan (p=0x1c <Address 0x1c out of bounds>,
len=, enc=0x95df180) at string.c:232
So it seems GC is still not preserving the hash used by rb_fstring. I
wonder if something else changed since r43210 which caused this, perhaps
r43718? I've never been a GC expert, especially not with the current GC.
#8 0x0057ab6e in rb_enc_str_coderange (str1=184779220, str2=173130540)
at string.c:376
Updated by tmm1 (Aman Karmani) about 11 years ago
The regression was indeed caused by r43718, and has been resolved with r43887.
Updated by ko1 (Koichi Sasada) about 11 years ago
The regression was indeed caused by r43718, and has been resolved with r43887.
No. Before r43718, it worked only accidentally.
Updated by tmm1 (Aman Karmani) about 11 years ago
Oops, @ko1 (Koichi Sasada) is right.
The issue at hand is that fstrings were allowed to be shared strings. With rb_gc_mark(), the shared string was more likely to get marked (due to the mark_stack addition). But there were still cases where the shared string had already been lazy swept by the time the fstring was marked or resurrected.
With r43887, we resolved the issue once and for all by disallowing shared fstrings altogether.
Updated by headius (Charles Nutter) almost 11 years ago
I've implemented something like this for JRuby 9k. Both the interpreter and the compiler pre-freeze and dedup the keys, allowing them to go straight into literal hashes.
Performance is significantly improved by doing this:
Before:
$ jruby -rbenchmark -e "10.times { puts Benchmark.measure { hash = nil; 10_000_000.times { hash = { 'a' => 1, 'b' => 1, 'c' => 1, 'd' => 1, 'e' => 1 } } } }"
5.810000 0.100000 5.910000 ( 5.282000)
5.230000 0.030000 5.260000 ( 5.101000)
4.710000 0.020000 4.730000 ( 4.509000)
4.500000 0.030000 4.530000 ( 4.375000)
4.630000 0.030000 4.660000 ( 4.516000)
4.510000 0.020000 4.530000 ( 4.390000)
4.560000 0.030000 4.590000 ( 4.433000)
4.580000 0.030000 4.610000 ( 4.455000)
4.540000 0.020000 4.560000 ( 4.416000)
4.560000 0.030000 4.590000 ( 4.435000)
After:
$ jruby -rbenchmark -e "10.times { puts Benchmark.measure { hash = nil; 10_000_000.times { hash = { 'a' => 1, 'b' => 1, 'c' => 1, 'd' => 1, 'e' => 1 } } } }"
3.270000 0.100000 3.370000 ( 2.295000)
2.140000 0.020000 2.160000 ( 2.069000)
1.870000 0.010000 1.880000 ( 1.779000)
1.930000 0.020000 1.950000 ( 1.744000)
1.790000 0.010000 1.800000 ( 1.734000)
1.810000 0.010000 1.820000 ( 1.750000)
1.790000 0.010000 1.800000 ( 1.743000)
1.820000 0.020000 1.840000 ( 1.769000)
1.820000 0.010000 1.830000 ( 1.766000)
1.770000 0.010000 1.780000 ( 1.712000)
For reference, 2.1p0 on my system:
$ rvm ruby-2.1 do ruby -rbenchmark -e "10.times { puts Benchmark.measure { hash = nil; 10_000_000.times { hash = { 'a' => 1, 'b' => 1, 'c' => 1, 'd' => 1, 'e' => 1 } } } }"
9.060000 0.040000 9.100000 ( 9.098937)
9.030000 0.030000 9.060000 ( 9.064616)
8.980000 0.020000 9.000000 ( 9.007513)
9.050000 0.030000 9.080000 ( 9.078869)
9.250000 0.040000 9.290000 ( 9.292398)
9.020000 0.030000 9.050000 ( 9.053593)
9.080000 0.030000 9.110000 ( 9.113024)
8.970000 0.020000 8.990000 ( 8.986615)
8.870000 0.010000 8.880000 ( 8.884534)
8.940000 0.010000 8.950000 ( 8.954753)
When I have the commit done (it needs a little cleanup; only a couple of hours' work at the moment) I will link it here in case it is useful.
Updated by headius (Charles Nutter) almost 11 years ago
I ran into a small snag so I think it will be useful to have my JRuby commits here.
First, a background commit... adding a weak dedup cache: https://github.com/jruby/jruby/commit/926ca89075a4a4c84592add729531189263c143f
It's not particularly complicated; a double-weak hash synchronized for concurrency purposes.
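In Ruby terms, the cache behaves roughly like the sketch below. The real JRuby cache is a double-weak, synchronized Java map; this toy version holds strong references, so it would leak in a long-running process:

```ruby
# Toy dedup cache: one canonical frozen copy per distinct string
# content. NOT weak-referenced, unlike the JRuby implementation.
class DedupCache
  def initialize
    @table = {}
    @lock  = Mutex.new # synchronized for concurrency, as in JRuby
  end

  def dedup(str)
    @lock.synchronize { @table[str] ||= str.dup.freeze }
  end
end

cache = DedupCache.new
a = cache.dedup("foo")
b = cache.dedup("foo".dup) # different object, same content
p a.equal?(b)              # => true
p a.frozen?                # => true
```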
Here's the commit that adds dedup to literal hashes with literal string keys: https://github.com/jruby/jruby/commit/aedcbc7b4024cf651a1df3bb65a8490a74a66562
This commit does all of the following:
- Interpreter and compiler logic for literal hash with literal keys will only freeze-dup them once, using the dedup cache.
- Hash will use dedup cache for all incoming strings.
The first problem I ran into was that there are tests in CSV that appear to depend on already-frozen strings not being deduplicated during hash insertion. This brought about the following failures:
Failure:
TestCSV::Interface#test_write_hash_with_string_keys [/Users/headius/projects/jruby/test/mri/csv/test_interface.rb:214]:
Expected "a" (oid=9726) to be the same as "a" (oid=9728).

Failure:
TestCSV::Interface::DifferentOFS#test_write_hash_with_string_keys [/Users/headius/projects/jruby/test/mri/csv/test_interface.rb:214]:
Expected "a" (oid=9726) to be the same as "a" (oid=9746).

Failure:
TestCSV::Row#test_to_hash [/Users/headius/projects/jruby/test/mri/csv/test_row.rb:304]:
Expected "A" (oid=9748) to be the same as "A" (oid=9750).

Failure:
TestCSV::Row::DifferentOFS#test_to_hash [/Users/headius/projects/jruby/test/mri/csv/test_row.rb:304]:
Expected "A" (oid=9752) to be the same as "A" (oid=9750).
I fixed these by adding a check to only dedup incoming strings if they weren't already frozen. I'm not sure if that's the right course of action or if the tests should be changed.
Then I ran into a more unusual snag. In RubySpec, there are tests for Hash#compare_by_identity that fail if I dedup incoming strings, since they test that the same string content in two different objects can hold two different values in the Hash. Specifically:
Hash#compare_by_identity persists over #dups FAILED
Expected {"foo"=>:glark}
to equal {"foo"=>:bar, "foo"=>:glark}
/Users/headius/projects/jruby/spec/ruby/core/hash/compare_by_identity_spec.rb:77:in `(root)'
org/jruby/RubyBasicObject.java:1573:in `instance_eval'
org/jruby/RubyEnumerable.java:1437:in `all?'
org/jruby/RubyFixnum.java:269:in `times'
org/jruby/RubyArray.java:1556:in `each'
/Users/headius/projects/jruby/spec/ruby/core/hash/compare_by_identity_spec.rb:2:in `(root)'
/Users/headius/projects/jruby/spec/ruby/core/hash/compare_by_identity_spec.rb:1:in `(root)'
org/jruby/RubyKernel.java:881:in `load'
org/jruby/RubyBasicObject.java:1573:in `instance_eval'
org/jruby/RubyArray.java:1556:in `each'
Hash#compare_by_identity persists over #clones FAILED
Expected {"foo"=>:glark}
to equal {"foo"=>:bar, "foo"=>:glark}
/Users/headius/projects/jruby/spec/ruby/core/hash/compare_by_identity_spec.rb:84:in `(root)'
org/jruby/RubyBasicObject.java:1573:in `instance_eval'
org/jruby/RubyEnumerable.java:1437:in `all?'
org/jruby/RubyFixnum.java:269:in `times'
org/jruby/RubyArray.java:1556:in `each'
/Users/headius/projects/jruby/spec/ruby/core/hash/compare_by_identity_spec.rb:2:in `(root)'
/Users/headius/projects/jruby/spec/ruby/core/hash/compare_by_identity_spec.rb:1:in `(root)'
org/jruby/RubyKernel.java:881:in `load'
org/jruby/RubyBasicObject.java:1573:in `instance_eval'
org/jruby/RubyArray.java:1556:in `each'
I fixed this issue by only deduping when compare_by_identity is false, but I'm not sure if it's correct.
So the basic question that came out of my experiment:
- Should string freeze-duping in Hash#[]= always dedup, regardless of whether the string has already been frozen or the Hash has compare_by_identity == true? Or else what combination of those conditions should lead to deduping?
Updated by normalperson (Eric Wong) almost 11 years ago
- Should string freeze-duping in Hash#[]= always dedup, regardless of
whether the string has already been frozen or the Hash has
compare_by_identity == true? Or else what combination of those
conditions should lead to deduping?
MRI doesn't replace/alter the key at all if compare_by_identity is true.
Thus keys remain mutable strings in the case of compare_by_identity.
Also, in my original proposed patch, it only deduped when a new frozen
string was required, so there's almost zero chance of incompatibility.
Pre-frozen strings are used as-is when used as keys in all cases.
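The identity-hash behavior described above is observable from plain Ruby (an illustrative check, not part of the patch):

```ruby
# An identity hash neither dups, freezes, nor dedups its string keys;
# the exact objects passed in are stored, and equal content in two
# distinct objects yields two distinct entries.
h = {}.compare_by_identity
k1 = "foo"
k2 = "foo" # same content, distinct object

h[k1] = :bar
h[k2] = :glark

p h.size               # => 2
p h.keys[0].equal?(k1) # => true (key stored as-is)
p h.keys[0].frozen?    # => false (still mutable)
```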
Btw, in addition to the current optimization and in lieu of my original
patch, I also have a proposed patch in
https://bugs.ruby-lang.org/issues/9188 which also optimizes aref along
with aset only when literal strings are used. This means we won't bust
the fstring cache with dynamically-generated strings, but string
literals in the source are not duplicated.
I'm not sure what the real-world performance impact is, but I suspect
Rails apps will benefit from that.
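To illustrate the concern (a sketch of the distinction, not the #9188 patch itself): a literal key appears once in the source and can be interned once, while a dynamically-built key produces a new unique string per use, and interning all of those would grow the fstring table with every distinct value.

```ruby
h = { "content-length" => 42 } # literal key: internable once at compile time

# Dynamic keys: each iteration builds a fresh, unique string.
# Interning every one would add a table entry per distinct value,
# "busting" the fstring cache with strings used only once.
1_000.times do |i|
  h["req-#{i}"] = i # a fresh key string per iteration
end

p h.size              # => 1001
p h["content-length"] # => 42
```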