LRUCache in Scala?

I know Guava has an excellent caching library, but I am looking for something more Scala/functional-friendly where I can do things like cache.getOrElse(query, { /* expensive operation */ }). I also looked at Scalaz's Memo, but that does not have LRU expiration.

The Spray folks have a spray-caching module which uses Futures. There is a plain LRU version and a version that allows you to specify an explicit time to live, after which entries are expired automatically.
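
For illustration, usage looks roughly like this (a sketch based on spray-caching's documented pattern; the maxCapacity parameter name and the expensiveOperation helper are assumptions, and details vary by spray version):

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import spray.caching.{Cache, LruCache}

// Hypothetical stand-in for real work.
def expensiveOperation(key: String): String = s"result for $key"

val cache: Cache[String] = LruCache(maxCapacity = 500)

// The body is evaluated only on a cache miss; the resulting
// Future is stored and shared by all callers for that key.
def cachedOp(key: String): Future[String] =
  cache(key) {
    expensiveOperation(key)
  }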

The use of Futures obviously allows you to write code that does not block. What is really cool, though, is that it solves the "thundering herd" problem as a bonus. Say, for example, that a hundred requests come in at once for the same entry, which is not in the cache. In a naive cache implementation, all hundred threads would get a miss on that entry and then run off to generate the same data for it, with 99% of that effort wasted. What you really want is for just one thread to generate the data and all hundred requestors to see the result. This happens quite naturally if your cache contains Futures: the first requestor immediately installs a Future in the cache, so only the first requestor misses; all hundred requestors get that Future for the generated result.
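
The mechanism is easy to sketch with just the standard library (this illustrates the idea only and is not spray-caching's implementation; it has no LRU eviction, and it needs Scala 2.12+ for the SAM conversion):

import java.util.concurrent.ConcurrentHashMap
import scala.concurrent.Future

class FutureCache[K, V] {
  private val underlying = new ConcurrentHashMap[K, Future[V]]()

  // computeIfAbsent installs the Future atomically, so the generator
  // runs at most once; concurrent callers for the same key all share
  // the same Future.
  def apply(key: K)(genValue: => Future[V]): Future[V] =
    underlying.computeIfAbsent(key, _ => genValue)
}

A production version would also evict failed Futures (otherwise a failure stays cached) and bound the size; spray-caching handles both.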

Found exactly what I was looking for in Twitter's Scala util library.
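
A hedged sketch of how it can be used; the class I have in mind is com.twitter.util.LruMap (util-collection module), but the artifact and package have moved between versions, so treat the import as an assumption:

import com.twitter.util.LruMap

// LruMap is a mutable.Map backed by an access-ordered LinkedHashMap;
// once the map holds maxSize entries, the least recently used one is evicted.
val cache = new LruMap[String, Int](1000)
cache.getOrElseUpdate("key1", /* expensive operation */ 111)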

An LRUCache solution based on Java's LinkedHashMap, exposed as a Scala mutable.Map:

import java.util.Collections.synchronizedMap

import scala.collection.JavaConverters._
import scala.collection.mutable

// accessOrder = true makes LinkedHashMap keep entries in
// least-recently-accessed-first order, which is what LRU needs.
class LRUCache[K, V](maxEntries: Int)
  extends java.util.LinkedHashMap[K, V](100, .75f, true) {

  // Called by LinkedHashMap after each insert; returning true
  // evicts the eldest (least recently used) entry.
  override def removeEldestEntry(eldest: java.util.Map.Entry[K, V]): Boolean =
    size > maxEntries
}

object LRUCache {
  // Wrap for thread safety and expose as a Scala mutable.Map.
  def apply[K, V](maxEntries: Int): mutable.Map[K, V] =
    synchronizedMap(new LRUCache[K, V](maxEntries)).asScala
}

When the map size exceeds maxEntries, the least recently used entry is removed.

The third LinkedHashMap constructor parameter, accessOrder, must be set to true for the LRU strategy: LinkedHashMap(int initialCapacity, float loadFactor, boolean accessOrder).

Usage example (the shouldBe assertions are ScalaTest matchers):

val cache = LRUCache[String, Int](1000)
val key = "key1"
val value = 111

cache.get(key) shouldBe None
cache += key -> value
cache.get(key) shouldBe Some(value)
cache(key) shouldBe value
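
Since the result is a plain mutable.Map, you also get the getOrElseUpdate accessor the question asked for; the second argument is evaluated only on a miss (note that with the synchronizedMap wrapper the lookup and the insert are each synchronized, but the pair is not atomic):

val result = cache.getOrElseUpdate("key2", /* expensive operation */ 42)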

Comments
  • That question is really close to this one
  • What is the difference between using cache.get(Key, Callable<Value>) (see API doc) and what you want to do? Is it just that you want to pass a function instead of a Callable?
  • Yes, I want to pass a function instead of a Callable. And I want all the functional goodies of Map.scala.