
Given a 2D (M x N) matrix and a 2D kernel (K x L), how do I return a matrix that is the result of max or mean pooling using the given kernel over the image?

I'd like to use numpy if possible.

Note: M, N, K, L can each be even or odd, and M, N need not be perfectly divisible by K, L, e.g. a 7x5 matrix and a 2x2 kernel.

Example of max pooling:

matrix:

array([[  20,  200,   -5,   23],
       [ -13,  134,  119,  100],
       [ 120,   32,   49,   25],
       [-120,   12,    9,   23]])

kernel: 2 x 2

solution:

array([[  200,  119],
       [  120,   49]])

1 Answer

You could use scikit-image's block_reduce:

import numpy as np
import skimage.measure

a = np.array([
      [  20,  200,   -5,   23],
      [ -13,  134,  119,  100],
      [ 120,   32,   49,   25],
      [-120,   12,    9,   23]
])

skimage.measure.block_reduce(a, (2,2), np.max)

Gives:

array([[200, 119],
       [120,  49]])
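
For mean pooling, pass np.mean as the reducing function instead of np.max. When the dimensions aren't exact multiples of the block size (the 7x5 matrix with a 2x2 kernel from the question), block_reduce pads the trailing edges with cval (0 by default) before reducing, so the padded zeros get averaged into the edge blocks and can also hide negative maxima. A small sketch with an arbitrary 7x5 array of my own:

import numpy as np
import skimage.measure

b = np.arange(35, dtype=float).reshape(7, 5)   # 7 x 5, not a multiple of 2 x 2

# np.mean gives mean pooling; the array is padded with cval (default 0)
# out to 8 x 6 before each 2 x 2 block is reduced, so the result is 4 x 3.
skimage.measure.block_reduce(b, (2, 2), np.mean)

# If the zero padding would skew your edge means, crop to a multiple of
# the block size first (or handle the edge blocks separately).
skimage.measure.block_reduce(b[:6, :4], (2, 2), np.mean)   # result is 3 x 2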

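If you'd rather stay in plain NumPy and your matrix dimensions are exact multiples of the kernel, a reshape does the same job; the pool2d helper below is an illustrative sketch, not part of the original answer:

import numpy as np

def pool2d(mat, k, l, func=np.max):
    """Pool an (M, N) array with a (k, l) kernel; M and N must be
    exact multiples of k and l."""
    m, n = mat.shape
    assert m % k == 0 and n % l == 0, "matrix must divide evenly into blocks"
    # Split into (m//k, k, n//l, l) blocks and reduce over each block.
    return func(mat.reshape(m // k, k, n // l, l), axis=(1, 3))

a = np.array([[  20,  200,   -5,   23],
              [ -13,  134,  119,  100],
              [ 120,   32,   49,   25],
              [-120,   12,    9,   23]])

pool2d(a, 2, 2, np.max)    # array([[200, 119], [120,  49]])
pool2d(a, 2, 2, np.mean)   # mean pooling with the same 2 x 2 kernel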