
python - Fastest way to store a numpy array in redis

I'm using redis on an AI project.

The idea is to have multiple environment simulators running policies on many CPU cores. The simulators write experience (a list of state/action/reward tuples) to a redis server (the replay buffer). A training process then reads the experience as a dataset to generate a new policy. The new policy is deployed to the simulators, the data from the previous run is deleted, and the process continues.
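Roughly, the write/read loop I have in mind looks like this (a sketch only; the key name and helper functions are purely illustrative, and serialising the state arrays is exactly the part I'm asking about):

import redis

r = redis.Redis(host='localhost', port=6379, db=0)

# Each simulator process appends a serialised (state, action, reward) tuple
# to a shared Redis list that acts as the replay buffer
def write_experience(serialised_tuple):
    r.rpush('replay_buffer', serialised_tuple)

# The training process reads the whole buffer as a dataset, then clears it
# before the next round of simulation starts
def read_experience():
    experience = r.lrange('replay_buffer', 0, -1)
    r.delete('replay_buffer')
    return experience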

The bulk of the experience is captured in the "state", which is normally represented as a large numpy array of dimension, say, 80 x 80. The simulators generate these as fast as the CPU will allow.

To this end, does anyone have good ideas or experience of the best/fastest/simplest way to write a lot of numpy arrays to redis? This is all on the same machine for now, but later it could be on a set of cloud servers. Code samples welcome!


1 Answer


I don't know if it is fastest, but you could try something like this...

Storing a Numpy array to Redis goes like this - see function toRedis():

  • get the shape of the Numpy array and encode it
  • append the Numpy array as bytes to the shape
  • store the encoded array under the supplied key

Retrieving a Numpy array goes like this - see function fromRedis():

  • retrieve from Redis the encoded string corresponding to the supplied key
  • extract the shape of the Numpy array from the string
  • extract the data, repopulate the Numpy array and reshape it to the original shape

#!/usr/bin/env python3

import struct
import redis
import numpy as np

def toRedis(r,a,n):
   """Store given Numpy array 'a' in Redis under key 'n'"""
   h, w = a.shape
   shape = struct.pack('>II',h,w)
   encoded = shape + a.tobytes()

   # Store encoded data in Redis
   r.set(n,encoded)
   return

def fromRedis(r,n):
   """Retrieve Numpy array from Redis key 'n'"""
   encoded = r.get(n)
   h, w = struct.unpack('>II',encoded[:8])
   # Interpret the payload with the original dtype, otherwise frombuffer()
   # defaults to float64 and the array would differ from the original
   a = np.frombuffer(encoded, dtype=np.uint16, offset=8).reshape(h,w)
   return a

# Create 80x80 numpy array to store
a0 = np.arange(6400,dtype=np.uint16).reshape(80,80) 

# Redis connection
r = redis.Redis(host='localhost', port=6379, db=0)

# Store array a0 in Redis under name 'a0array'
toRedis(r,a0,'a0array')

# Retrieve from Redis
a1 = fromRedis(r,'a0array')

np.testing.assert_array_equal(a0,a1)

You could add more flexibility by encoding the dtype of the Numpy array along with the shape; a sketch of that follows below. I didn't do it in the code above because you may already know all your arrays are of one specific type, in which case the code would just be bigger and harder to read for no reason.
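If you do need that flexibility, a minimal sketch of one way to pack the dtype name into the header as well (the function names here are just illustrative, not part of the code above):

import struct
import numpy as np

def toRedisAny(r, a, n):
    """Store a 2-D Numpy array 'a' of any dtype in Redis under key 'n'"""
    dt = a.dtype.str.encode()                   # e.g. b'<u2' for little-endian uint16
    h, w = a.shape
    header = struct.pack('>III', h, w, len(dt)) + dt
    r.set(n, header + a.tobytes())

def fromRedisAny(r, n):
    """Retrieve a 2-D Numpy array of any dtype from Redis key 'n'"""
    encoded = r.get(n)
    h, w, dlen = struct.unpack('>III', encoded[:12])
    dt = np.dtype(encoded[12:12 + dlen].decode())
    return np.frombuffer(encoded, dtype=dt, offset=12 + dlen).reshape(h, w)

The header grows by a few bytes per key, but the same pair of functions then works for float32 states, uint8 images and so on.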

Rough benchmark on a modern iMac:

80x80 Numpy array of np.uint16   => 58 microseconds to write
200x200 Numpy array of np.uint16 => 88 microseconds to write
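If you want a rough number on your own machine, a quick timing along these lines (reusing toRedis(), r and a0 from the code above; the loop count and key name are arbitrary) should be close enough:

import timeit

# Time repeated writes of the same 80x80 uint16 array and report per-call cost
n_calls = 10000
secs = timeit.timeit(lambda: toRedis(r, a0, 'bench'), number=n_calls)
print(f'{secs / n_calls * 1e6:.1f} microseconds per write')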

Keywords: Python, Numpy, Redis, array, serialise, serialize, key, incr, unique

