Bug #11264

Updated by nobu (Nobuyoshi Nakada) almost 9 years ago

Hi, 

 I'm not sure if this is a bug, or just undocumented behaviour, but here's a script to reproduce the memory leak: 

 ~~~ruby
 require 'json' 

 class MyClass 
   def to_json(*) 
     "a" * 1048576 # 1 megabytes of chars 
   end 
 end 

 class MyOther 
   def to_json(*) 
     raise "OMG" 
   end 
 end 

 1000.times do |i| # will leak up to ~ 4 gigs 
   puts i 
   JSON.dump([MyClass.new, MyClass.new, MyClass.new, MyOther.new]) rescue nil 
 end 
 ~~~

 What's happening is that the C extension iterates over the array in order, appending each element's JSON to the `fbuffer` as it goes, and eventually dumps the buffer out as the result. The problem is that the extension calls the user-supplied `to_json` method (the API extension point of adding a `to_json` method to a class or object) without wrapping the call in some sort of 'begin...rescue, free(buffer), re-raise' guard, so when `to_json` raises, the buffer is never freed. Normally this isn't too bad, except when a lot of data was appended to the buffer before the error got raised.
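
 To make the missing guard concrete, here's a pure-Ruby sketch of the control flow the generator would need. It's only illustrative: the real buffer lives in C and would have to be released with the extension's own fbuffer cleanup rather than any Ruby call, `generate_with_guard` is a made-up name, and the sketch skips the real JSON punctuation; it only shows where the cleanup has to happen.

~~~ruby
require 'json'

# Illustrative only: a Ruby stand-in for the C generator loop, showing the
# begin...rescue / free(buffer) / re-raise shape the extension is missing.
def generate_with_guard(array)
  buffer = String.new # stands in for the C-side fbuffer
  begin
    array.each { |obj| buffer << obj.to_json }
  rescue Exception
    buffer.clear # in the C extension this is where the fbuffer must be freed
    raise        # then re-raise the original error after cleaning up
  end
  buffer
end
~~~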

 To test it against normal behaviour in the above script, take out the offending `MyOther.new` from the array. It should run much more smoothly that way :)

 Note that since the `fbuffer`s aren't GC-marked (not that they should be), it isn't possible to trace this leak using `GC.stat`.

 Once again, I'm not sure if this is a bug, or whether we should simply never raise errors from custom `to_json` methods (i.e. always wrap them in a begin...rescue block).
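
 For reference, that workaround would look something like this for the `MyOther` class from the script above; `risky_serialization` is just a hypothetical placeholder for whatever logic might raise, and the error-hash fallback is an arbitrary choice for the example.

~~~ruby
require 'json'

class MyOther
  def to_json(*args)
    risky_serialization(*args) # hypothetical: whatever might raise
  rescue => e
    # Never let the error propagate into the C generator; return a
    # harmless placeholder instead.
    { "error" => e.message }.to_json(*args)
  end
end
~~~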

 Thanks, 

 I also reported this to the JSON gem maintainer here: https://github.com/flori/json/issues/251
