MemoryStore

MemoryStore manages blocks stored in memory (registered in the internal entries registry).

MemoryStore requires a SparkConf, a BlockInfoManager, a SerializerManager, a MemoryManager and a BlockEvictionHandler when it is created.

Caution
FIXME Where are these dependencies used?
Caution
FIXME Where is the MemoryStore created? What params provided?
Note
MemoryStore is a private[spark] class.
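
The exact constructor is not reproduced here, but a minimal sketch of the class shape, assuming parameter names that merely mirror the dependency types listed above, could look as follows:

// A sketch only -- parameter names are assumptions mirroring the dependencies above
private[spark] class MemoryStore(
    conf: SparkConf,
    blockInfoManager: BlockInfoManager,
    serializerManager: SerializerManager,
    memoryManager: MemoryManager,
    blockEvictionHandler: BlockEvictionHandler)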
Tip

Enable INFO or DEBUG logging level for the org.apache.spark.storage.memory.MemoryStore logger to see what happens inside.

Add the following line to conf/log4j.properties:

log4j.logger.org.apache.spark.storage.memory.MemoryStore=DEBUG

Refer to Logging.

entries Registry

entries is a Java LinkedHashMap with an initial capacity of 32, a load factor of 0.75 and access-order ordering mode (i.e. iteration is in the order in which the entries were last accessed, from least-recently to most-recently accessed).

Note
entries is Java’s java.util.LinkedHashMap.
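
A sketch of how such a map can be created in Scala (the MemoryEntry value type is an assumption used only for illustration):

// initial capacity 32, load factor 0.75, accessOrder = true
// (iteration from least- to most-recently accessed)
// MemoryEntry as the value type is assumed for illustration
private val entries = new java.util.LinkedHashMap[BlockId, MemoryEntry[_]](32, 0.75f, true)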

putBytes

putBytes[T: ClassTag](
  blockId: BlockId,
  size: Long,
  memoryMode: MemoryMode,
  _bytes: () => ChunkedByteBuffer): Boolean

putBytes requests size bytes of memory for the blockId block from the current MemoryManager. If successful, it registers a SerializedMemoryEntry (with the input _bytes and memoryMode) for blockId in the internal entries registry.

You should see the following INFO message in the logs:

INFO Block [blockId] stored as bytes in memory (estimated size [size], free [bytes])

putBytes returns true once blockId has been stored; when the memory request fails, the block is not stored and putBytes returns false.
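
A hedged sketch of the flow above (acquireStorageMemory, SerializedMemoryEntry and freeMemory are assumed names used for illustration, not verified signatures):

// Sketch only -- helper names below are assumptions, not the actual Spark code
def putBytes[T: ClassTag](
    blockId: BlockId,
    size: Long,
    memoryMode: MemoryMode,
    _bytes: () => ChunkedByteBuffer): Boolean = {
  // ask the MemoryManager for `size` bytes of storage memory for the block
  val acquired = memoryManager.acquireStorageMemory(blockId, size, memoryMode)
  if (acquired) {
    // materialize the bytes and register a SerializedMemoryEntry in `entries`
    val entry = SerializedMemoryEntry[T](_bytes(), memoryMode, implicitly[ClassTag[T]])
    entries.synchronized { entries.put(blockId, entry) }
    logInfo(s"Block $blockId stored as bytes in memory (estimated size $size, free $freeMemory)")
  }
  acquired
}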

Evicting Blocks to Free Space

Caution
FIXME

Removing Block

Caution
FIXME

Settings

spark.storage.unrollMemoryThreshold

spark.storage.unrollMemoryThreshold (default: 1024 * 1024) controls…
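
The property can be set like any other Spark setting, e.g. through SparkConf (the 2 MB value below is merely an example):

import org.apache.spark.SparkConf

// bump the unroll memory threshold to 2 MB (illustrative value only)
val conf = new SparkConf()
  .set("spark.storage.unrollMemoryThreshold", (2 * 1024 * 1024).toString)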
