Feature #16471

Two feature requests for WeakRef: get original object, callback feature

Added by Snappingturtle (Mike O'Sullivan) over 4 years ago. Updated over 4 years ago.

Status: Open
Assignee: -
Target version: -
[ruby-core:96618]

Description

I'd like to request two features for WeakRef. I'll explain what I want, then provide a real-world use case.

First, add the ability to pull the original object out of the WeakRef object, something like this:

require 'weakref'

myhash = {}
wr = WeakRef.new(myhash)
myhash = wr.original_object   # proposed: hand back the object the WeakRef wraps
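
As far as I can tell, WeakRef already inherits __getobj__ from Delegator, which does roughly this (it raises WeakRef::RefError once the object has been collected), so maybe this is mostly a request for a friendlier name:

require 'weakref'

myhash = {}
wr = WeakRef.new(myhash)

# Already works today: __getobj__ returns the wrapped object while it's
# alive and raises WeakRef::RefError once it has been garbage collected.
wr.__getobj__.equal?(myhash)   # => true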

Second, add a callback feature for when a WeakRef's object is being purged by GC. It would work something like this:

wr = WeakRef.new(myobject)

# proposed: run the block when the wrapped object is about to be garbage collected
wr.on_garbage do |ref|
    puts 'trashing'
end

# time goes by...
# myobject goes out of scope
# outputs "trashing"
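
The closest existing mechanism I know of is ObjectSpace.define_finalizer, though it hands the proc the object id rather than the WeakRef, and the proc has to be created somewhere that can't see the object or the object will never be collected. A rough sketch (trashing_notifier is just an illustrative helper):

# Build the finalizer proc in a scope that can't see myobject, so its
# binding doesn't keep the object alive.
def trashing_notifier
    proc { |_object_id| puts 'trashing' }
end

myobject = Object.new
ObjectSpace.define_finalizer(myobject, trashing_notifier)

# time goes by...
# myobject becomes unreachable and is collected
# outputs "trashing"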

Here's the specific use case I would want it for. I'm developing a database system which includes a class called Node. A Node object holds a reference to a database handle and the primary key of a record in that database. It also has methods for getting and setting values in that record. So a simplified version looks something like this:

class Node
    attr_reader :dbh
    attr_reader :pk
    
    def initialize(dbh, pk)
        @dbh = dbh
        @pk = pk
    end
    
    def set(fieldname, value)
        # a bunch of SQL to set the value
    end
    
    def get(fieldname)
        # a bunch of SQL to get the value
    end
end

node = Node.new dbh, 'abc'
node.set 'name', 'Fred'

The database object will have a node method, so you would usually get nodes like this:

node = dbh.node('abc')

Make sense so far? It's a pretty simple concept. It works, but I'd like to make a small improvement. (Whether or not it's actually an improvement is a judgement call... I expect some disagreement on this point. But work with me here.)

I'd like the database object to keep a cache of Node objects. The database itself wouldn't keep those node objects alive: they can still fall out of scope and get purged by the GC. But if the cache is never cleared of dead references, it just grows bigger and bigger. I could occasionally run a routine that works through the cache and purges dead references, but that seems inefficient. It would be better to have entries removed as the objects die.

So I could implement it something like this:

class DataBase
    attr_reader :cache
    
    def initialize
        @cache = {}
    end
    
    def node(pk)
        if @cache[pk]
            # Here's where we need to get at the original object
            return @cache[pk].original_object
        else
            new_node = Node.new(self, pk)
            
            @cache[pk] = WeakRef.new(new_node)
            
            # Here's where we set the callback
            @cache[pk].on_garbage do |wr|
                wr.dbh.cache.delete wr.pk
            end
            
            return new_node
        end
    end
end

So when a Node object is garbage collected, its entry is deleted from the cache, and the cache stays free of dead references.
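
For the record, the closest I can get today is ObjectSpace.define_finalizer combined with WeakRef's existing __getobj__ and weakref_alive?. A rough sketch (cache_sweeper is just a name I made up; the proc is built in a separate method so that its binding can't accidentally keep the node alive):

require 'weakref'

class DataBase
    attr_reader :cache

    def initialize
        @cache = {}
    end

    def node(pk)
        ref = @cache[pk]
        return ref.__getobj__ if ref && ref.weakref_alive?

        new_node = Node.new(self, pk)
        @cache[pk] = WeakRef.new(new_node)

        # Remove the cache entry once the node is garbage collected.
        ObjectSpace.define_finalizer(new_node, self.class.cache_sweeper(@cache, pk))

        new_node
    end

    # Built in a scope with no reference to the node, so the finalizer proc
    # doesn't keep the node alive through its binding.
    def self.cache_sweeper(cache, pk)
        # Only delete the entry if it's still a dead reference, in case a new
        # node was cached under the same key before this finalizer ran.
        proc { |_object_id| cache.delete(pk) if cache[pk] && !cache[pk].weakref_alive? }
    end
end

That does the job, but it's easy to get subtly wrong, which is why a callback right on the WeakRef seems nicer.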

I'll be interested to hear what you think of this idea and how difficult it would be to implement it.


Related issues (1 total, 0 open, 1 closed)

Related to Ruby master - Feature #16038: Provide a public WeakMap that compares by equality rather than by identity (Closed)