I am trying to reproduce in Python the orthogonal polynomials generated by the following R code:

X <- cbind(1, poly(x = x, degree = 9))

To do this I implemented my own function for generating orthogonal (Hermite) polynomials:

import numpy as np
from scipy.special import eval_hermite  # physicists' Hermite polynomials H_n

def get_hermite_poly(x, degree):
    N, = x.shape
    X = np.zeros((N, degree + 1))
    for n in range(N):
        for deg in range(degree + 1):
            # evaluate the degree-deg Hermite polynomial at the n-th sample
            X[n, deg] = eval_hermite(deg, float(x[n]))
    return X

but its output does not match R's. Does anyone know what type of orthogonal polynomial poly() uses? I searched the documentation but it doesn't say.
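For reference, R's poly() does not evaluate a classical family such as Hermite: as far as I can tell it orthogonalizes the powers of the (centered) input values themselves, essentially a QR decomposition of the centered Vandermonde matrix. A minimal NumPy sketch of that construction (the column signs can differ from R's output by a factor of -1, which I adjust for via the diagonal of R):

```python
import numpy as np

def r_poly(x, degree):
    """Sketch of poly(x, degree): orthonormal columns from a QR
    decomposition of the centered Vandermonde matrix."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    # columns: xc^0, xc^1, ..., xc^degree
    V = np.vander(xc, degree + 1, increasing=True)
    Q, R = np.linalg.qr(V)
    # drop the constant column (poly() omits the intercept) and flip
    # column signs so each power has positive projection on its column
    signs = np.sign(np.diag(R)[1:])
    return Q[:, 1:] * signs
```

The columns come out orthonormal, and (unlike Hermite polynomials) they depend on the data x, which is why evaluating a fixed family at the same points cannot reproduce them.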

To give some context, I am trying to implement the following R code in Python (https://stats.stackexchange.com/questions/313265/issue-with-convergence-with-sgd-with-function-approximation-using-polynomial-lin/315185#comment602020_315185):

set.seed(1234)
N <- 10
x <- seq(from = 0, to = 1, length = N)
mu <- sin(2 * pi * x * 4)
y <- mu
plot(x, y)

X <- cbind(1, poly(x = x, degree = 9))
# X <- sapply(0:9, function(i) x^i)

w <- rnorm(10)
learning_rate <- function(t) .1 / t^(.6)
n_samp <- 2
for (t in 1:100000) {
  mu_hat <- X %*% w
  idx <- sample(1:N, n_samp)
  X_batch <- X[idx, ]
  y_batch <- y[idx]
  score_vec <- t(X_batch) %*% (y_batch - X_batch %*% w)
  change <- score_vec * learning_rate(t)
  w <- w + change
}

plot(mu_hat, ylim = c(-1, 1))
lines(mu)
fit_exact <- predict(lm(y ~ X - 1))
lines(fit_exact, col = 'red')
abs(w - coef(lm(y ~ X - 1)))

because it seems to be the only basis that makes gradient descent converge for linear regression with polynomial features.
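For comparison, here is a rough NumPy port of the loop above. The orthogonal design is built with a QR decomposition of the centered Vandermonde matrix in place of poly(), and np.random.default_rng stands in for set.seed, so the individual random draws will not match R's run:

```python
import numpy as np

rng = np.random.default_rng(1234)
N = 10
x = np.linspace(0.0, 1.0, N)
mu = np.sin(2 * np.pi * x * 4)
y = mu.copy()

# Orthogonal polynomial design: QR of the centered Vandermonde matrix,
# with an explicit intercept column prepended (mirrors cbind(1, poly(...))).
V = np.vander(x - x.mean(), 10, increasing=True)  # columns: 1, xc, ..., xc^9
Q, _ = np.linalg.qr(V)
X = np.column_stack([np.ones(N), Q[:, 1:]])

w = rng.standard_normal(10)

def learning_rate(t):
    return 0.1 / t ** 0.6

n_samp = 2
for t in range(1, 100001):
    # sample a mini-batch without replacement, like R's sample()
    idx = rng.choice(N, size=n_samp, replace=False)
    X_batch, y_batch = X[idx], y[idx]
    score_vec = X_batch.T @ (y_batch - X_batch @ w)
    w = w + learning_rate(t) * score_vec

mu_hat = X @ w
# exact least-squares solution, analogous to lm(y ~ X - 1)
w_exact, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.abs(w - w_exact).max())  # should be small after convergence
```

Since there are 10 points and 10 basis functions, the exact fit interpolates y, so the stochastic gradient vanishes at the optimum and the iterates settle down as the step size decays.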

My intuition is that any orthogonal (or at least orthonormal) polynomial basis should work and give a Hessian with condition number 1, but I can't seem to make it work in Python.
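A quick sanity check of that intuition: with orthonormal columns the Gram matrix X^T X (the least-squares Hessian) is the identity, so its condition number is 1, while the raw powers of x on [0, 1] give a nearly singular Gram matrix:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 10)

# Raw polynomial features x^0 .. x^9: a badly conditioned Gram matrix
V = np.vander(x, 10, increasing=True)
cond_raw = np.linalg.cond(V.T @ V)

# Orthonormalized columns (QR of the same matrix): Gram matrix ~ identity
Q, _ = np.linalg.qr(V)
cond_orth = np.linalg.cond(Q.T @ Q)

print(f"raw: {cond_raw:.3e}, orthonormal: {cond_orth:.6f}")
```

Hermite polynomials evaluated at these points are orthogonal with respect to a Gaussian weight over the whole real line, not with respect to the empirical sample, so their design matrix does not have this property.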