I'm using Redis on an AI project.

The idea is to have multiple environment simulators running policies on a lot of CPU cores. The simulators write experience (a list of state/action/reward tuples) to a redis server (replay buffer). Then a training process reads the experience as a dataset to generate a new policy. The new policy is deployed to the simulators, data from the previous run is deleted, and the process continues.

The bulk of the experience is captured in the "state", which is normally represented as a large numpy array of dimensions, say, 80 x 80. The simulators generate these as fast as the CPU will allow.

To this end, does anyone have good ideas about, or experience with, the best/fastest/simplest way to write a lot of numpy arrays to Redis? This is all on the same machine, but later it could be on a set of cloud servers. Code samples welcome!

1 Answer


To use Redis from Python, you will need a Python Redis client, for example redis-py.
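A minimal connection sketch, assuming the redis-py package (installed with pip install redis) and a Redis server on the default local port; adjust host and port when the buffer moves to a set of cloud servers:

import redis

# Connect to a local Redis server (assumption: default host, port and db)
r = redis.Redis(host='localhost', port=6379, db=0)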

Storing a NumPy array in Redis works like this - see the function toRedis() below:

  1. get the shape of the NumPy array and encode it

  2. append the NumPy array as bytes to the encoded shape

  3. store the encoded array under the supplied key

You can refer to the following code: 

import struct
import redis
import numpy as np


def toRedis(r, a, n):
    """Store a given Numpy array 'a' in Redis under key 'n'."""
    h, w = a.shape
    # Pack the array shape as two big-endian unsigned 32-bit integers
    shape = struct.pack('>II', h, w)
    # Prepend the shape header to the raw array bytes
    encoded = shape + a.tobytes()
    # Store the encoded data in Redis under the supplied key
    r.set(n, encoded)
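Reading the experience back for training is the mirror image. Below is a minimal sketch of a hypothetical fromRedis() helper; note that the encoding above stores only the shape, not the dtype, so the writer and reader must agree on the dtype out of band (np.float32 is assumed here):

def fromRedis(r, n):
    """Retrieve the Numpy array stored under key 'n' from Redis."""
    encoded = r.get(n)
    # Unpack the 8-byte shape header written by toRedis()
    h, w = struct.unpack('>II', encoded[:8])
    # The dtype is not part of the payload, so it must be known in
    # advance; np.float32 here is an assumption
    a = np.frombuffer(encoded[8:], dtype=np.float32).reshape(h, w)
    return a

A quick round-trip check, using the hypothetical key name 'state:0':

a0 = np.random.rand(80, 80).astype(np.float32)
toRedis(r, a0, 'state:0')
a1 = fromRedis(r, 'state:0')
assert np.array_equal(a0, a1)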
