With any Ruby code under load, it's important to have a proper understanding of what's happening in memory -- if adding memoisation yields significant gains, there's likely much more potential for optimisation.
Many in the Ruby community have taken a 1974 quote from Donald Knuth out of context and used it as justification for putting minimal or zero effort into memory efficiency.
The article is a very interesting read, but the implementation is heavily reliant on metaprogramming magic, which can be something of a grenade in real-world projects, especially if anyone fairly new to Ruby has to fix it later.
Almost none of the gems I used 10 years ago still exist today -- taking a small penalty for hand-rolling initially will usually save days or more later -- the author even says a similar gem he built earlier is long gone.
This was a pretty interesting read, especially having just recently read an old implementation of caching as a context manager in Python by a colleague.
I'm a little unclear on the strong/weak references part: for a memoization scenario, the cached value shouldn't really be getting garbage collected _at all_ until it gets evicted, which would typically be managed by some LRU/FIFO/etc. strategy with size caps. So where do strong/weak references come into play?
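For illustration, here's a sketch of the distinction using Ruby's stdlib WeakRef (my own example, not the article's code): a plain Hash entry holds a strong reference, so GC can never reclaim the value; a WeakRef-wrapped entry can be collected once nothing else points at it, which is what makes weak caches possible in the first place.

```ruby
require 'weakref'

strong_cache = {}
weak_cache   = {}

value = "expensive result"
strong_cache[:key] = value              # strong ref: pins the value in memory
weak_cache[:key]   = WeakRef.new(value) # weak ref: GC may reclaim it later

weak_cache[:key].weakref_alive? # true here, since `value` is still referenced
weak_cache[:key].upcase         # WeakRef delegates to the target while alive
```

Once `value` goes out of scope and GC runs, `weakref_alive?` can flip to false and delegation raises WeakRef::RefError -- that's the case a weak-reference cache has to handle that a plain Hash never does.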
Can't mention Fibonacci and memoization in the same sentence without me breaking out my favorite Python party trick:
def fib(n, cache = {0: 0, 1: 1}):
    if n not in cache:
        cache[n] = fib(n-1) + fib(n-2)
    return cache[n]
The directly translated Ruby version (from Stack Overflow, of course) is even shorter:
def fib(n, cache=Hash.new{ |h,k| h[k] = k < 2 ? k : h[k-1] + h[k-2] })
  cache[n]
end
It runs out of stack at around 7146, on my machine at least. The Python one is limited by the recursion depth limit in my test, but of course that's configurable, at your own risk.
If you're going to golf it anyway, endless-def:
  def fib(n, cache=Hash.new{ |h,k| h[k] = k < 2 ? k : h[k-1] + h[k-2] }) = cache[n]
It's actually kind of ungolfed. The default version would be just
fib = Hash.new{ |h,k| h[k] = k < 2 ? k : h[k-1] + h[k-2] }
fib[7145]
There's RUBY_THREAD_VM_STACK_SIZE for Ruby.
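Raising RUBY_THREAD_VM_STACK_SIZE (an environment variable read at interpreter startup) is one way around the stack limit; another is to drop recursion altogether. A bottom-up sketch of my own (not one of the golfed versions above):

```ruby
# Fill the memo table from the bottom up -- no recursion, so no stack
# limit to hit, and Ruby Integers grow arbitrarily large.
def fib_iterative(n)
  cache = [0, 1]
  (2..n).each { |i| cache[i] = cache[i - 1] + cache[i - 2] }
  cache[n]
end

fib_iterative(7146) # completes without blowing the stack
```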
In real code I've only ever seen memoization used as a cache. That leaves cache invalidation to some higher level, which is in a poor position to do it well. In my experience, it most often means someone has to figure out what things to manually restart, and when.
I see no mention of object shape optimization, or of why and how excessive memoization (or just instance variables in general) can be expensive.
I've come across memoize gems before and thought, "do I really need to introduce a dependency when I can just save results into a hash?" -- but this makes a convincing argument for leveraging a gem like memo_wise. The clean DSL syntax and the perf gains are much nicer than micromanaging it.
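For comparison, here's the hand-rolled approach sketched out (class and method names are made up). One subtlety the gems handle for you: `@x ||= compute` re-runs the computation on every call whenever the result is nil or false, so checking `defined?` is the safer hand-rolled pattern.

```ruby
class Lookup
  def answer
    # `@answer ||= expensive_compute` would never cache a nil result;
    # checking whether the ivar is defined caches any value, including nil.
    return @answer if defined?(@answer)
    @answer = expensive_compute
  end

  private

  def expensive_compute
    sleep 0.01 # stand-in for real work
    nil        # even a nil result gets cached correctly
  end
end
```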
memo_wise maintainer here. No pressure from me to try it out, but if you do use it and have any feedback or suggestions please let us know!
> My preferred approach to [thread safety] is to explicitly not think about thread safety, and delegate this concern to the caller.
Good call IMO. If I were the author, I'd have implemented it; it would mostly work for current use cases, but would be a pain to maintain and impossible to extend. I learned something today.
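Delegating to the caller might look like this -- a sketch of my own (not the author's code), where the memoizing object stays lock-free and a caller who shares it across threads brings their own Mutex:

```ruby
class SlowFetcher
  def result
    @result ||= compute # not thread-safe on its own
  end

  private

  def compute
    rand(1..1000) # stand-in for expensive work
  end
end

fetcher = SlowFetcher.new
lock = Mutex.new

# The caller serializes access; the first thread in computes and caches,
# the rest read the memoized value.
values = 4.times.map do
  Thread.new { lock.synchronize { fetcher.result } }
end.map(&:value)
```

Single-threaded callers pay nothing, which is presumably the point of pushing the concern outward.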
There are only 1 hard problems in computer science: memoization and ... shoot, what was it again?
It’s fun how the runtime in seconds for the naive Fibonacci is
36: 14930352 (1.2s)
37: 24157817 (2.0s)
38: 39088169 (3.2s)
39: 63245986 (5.2s)
40: 102334155 (8.3s)
41: 165580141 (13.5s)
42: 267914296 (21.8s)
What a familiar pattern in those runtimes!
I know it’s pointless, but now I’d like to see a Fibonacci implementation that returns these runtimes as the result.
That's a neat observation! But isn't it for the naïve reason that, with negligible overhead, the run time of executing `fib(n) = fib(n - 1) + fib(n - 2)` is the run time of executing `fib(n - 1)`, plus the run time of executing `fib(n - 2)`?
On the other hand, if someone had asked me for an estimate on the run time, I probably would have tried breaking out the master theorem instead of thinking of this, so I don't want to downplay the observation.
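The recurrence is easy to sanity-check against the numbers quoted upthread: with negligible overhead, T(n) should be roughly T(n-1) + T(n-2), just like Fibonacci itself.

```ruby
# Timings copied from the comment above.
times = { 36 => 1.2, 37 => 2.0, 38 => 3.2, 39 => 5.2,
          40 => 8.3, 41 => 13.5, 42 => 21.8 }

(38..42).each do |n|
  predicted = times[n - 1] + times[n - 2]
  printf("fib(%d): measured %.1fs, predicted %.1fs\n", n, times[n], predicted)
end
```

Every prediction lands within about 0.1 s of the measured value, so the observation holds up.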
Fibonacci never stops surprising!
With all that it borrowed from Perl, I'm surprised Ruby didn't also borrow the defined-or operator `//=` (https://perldoc.perl.org/perlop#Logical-Defined-Or). (Or, if it did, I guess the author hadn't encountered it.)
I don't think Ruby has this natively, but Rails adds it with Object#presence. https://api.rubyonrails.org/classes/Object.html#method-i-pre...
ActiveSupport adds tons of great convenience methods to Ruby and you can require it even outside of Rails! blank, present; date and time conversions like 1.month.from_now; except; enumerable methods like pluck... It's just lovely
require 'active_support/all'
https://guides.rubyonrails.org/v5.2/active_support_core_exte...
Not sure I understand the Perl example, but Ruby has ||=, which checks for nil (or false) but not definedness.
You can do
  a ||= b
and if a is nil, it will get the value of b; otherwise it keeps its current value.
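To spell out the difference from Perl's `//=` (which only assigns when the left side is undef), a small sketch of my own:

```ruby
# Ruby's ||= assigns whenever the left side is falsy -- that means
# nil OR false, not just "undefined" as with Perl's defined-or.
a = nil
a ||= 42        # a was nil, so it becomes 42

b = false
b ||= "truthy"  # b was false, so ||= overwrites it too

c = 0
c ||= 99        # 0 is truthy in Ruby, so c stays 0
```

The `b` case is the one that bites memoization: a legitimately-false cached value keeps getting recomputed.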