# Chandan Rajpurohit

An Artist With Technical Skills

Ridge regression was originally developed to deal with the problem of having more features than data points.

It can also be used to add bias into our estimates, which can reduce variance and give better predictions overall.

Shrinkage methods let us drive unimportant parameters toward zero, so we get a better feel for which features actually matter. They can also produce better prediction values than plain linear regression.
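Concretely, ridge regression adds an $\ell_2$ penalty on the weights to the ordinary least-squares objective, which yields a closed-form solution (this is the standard formulation, and it is what the function below computes):

```latex
\hat{w}_{\mathrm{ridge}}
  = \arg\min_{w} \; \lVert y - Xw \rVert_2^2 + \lambda \lVert w \rVert_2^2
  = (X^{\top}X + \lambda I)^{-1} X^{\top} y
```

The $\lambda I$ term is what makes the matrix invertible even when $X$ has more columns than rows.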

```python
import numpy as np

def ridgeRegression(mat1, mat2, lamb=0.2):
    """Closed-form ridge regression: w = (X^T X + lambda*I)^-1 X^T y."""
    trans1 = mat1.T * mat1
    # Add lambda on the diagonal of X^T X (not of X itself).
    deno = trans1 + np.eye(np.shape(mat1)[1]) * lamb
    if np.linalg.det(deno) == 0.0:
        print("singular matrix, cannot do inverse")
        return
    w = deno.I * (mat1.T * mat2)
    return w
```

This function implements ridge regression for any given value of lambda; the default is 0.2.
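As a quick sanity check, here is the same closed-form solution computed with `np.linalg.solve` on a tiny made-up dataset (the data and expected behaviour below are my own illustration, not from the original post). With noise-free data generated as `y = 3x`, the ridge weight comes out slightly below 3 because the penalty shrinks it:

```python
import numpy as np

# Hypothetical toy data: y is exactly 3 * x, so OLS would recover w = 3.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([[3.0], [6.0], [9.0], [12.0]])

# Closed-form ridge solution: w = (X^T X + lambda*I)^-1 X^T y
lamb = 0.2
w = np.linalg.solve(X.T @ X + lamb * np.eye(X.shape[1]), X.T @ y)
# Because lambda > 0, w is shrunk slightly below the OLS value of 3.
```

Using `solve` instead of an explicit inverse is numerically safer, though the matrix-inverse version above matches the textbook formula more literally.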

```python
def test(arr1, arr2):
    mat1 = np.mat(arr1)
    mat2 = np.mat(arr2).T
    # Centre the targets.
    mean1 = np.mean(mat2, 0)
    mat2 = mat2 - mean1
    # Standardise the features (subtract mean, divide by variance).
    means2 = np.mean(mat1, 0)
    var1 = np.var(mat1, 0)
    mat1 = (mat1 - means2) / var1
    testPoints = 20
    wmat = np.zeros((testPoints, np.shape(mat1)[1]))
    for i in range(testPoints):
        # Sweep lambda on an exponential scale, from exp(-10) up to exp(9).
        w = ridgeRegression(mat1, mat2, np.exp(i - 10))
        wmat[i, :] = w.T
    return wmat
```

This runs ridge regression over a range of lambda values, from very small (almost plain linear regression) to very large (heavily shrunk weights), after standardising the features so the penalty treats them fairly.
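The effect of that sweep can be seen without the full function: as lambda grows, the norm of the ridge weight vector shrinks toward zero. The snippet below is a minimal sketch of that behaviour on synthetic data of my own choosing (the true weights `[2, -1, 0.5]` and the noise level are illustrative assumptions):

```python
import numpy as np

# Hypothetical synthetic data with known true weights.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.standard_normal(50)

# Same exponential lambda sweep as in test(): exp(-10) .. exp(9).
norms = []
for i in range(20):
    lamb = np.exp(i - 10)
    w = np.linalg.solve(X.T @ X + lamb * np.eye(3), X.T @ y)
    norms.append(np.linalg.norm(w))
# The weight norm decreases monotonically as lambda increases.
```

Plotting `norms` (or the individual coefficients, as the book-style `wmat` allows) is the usual way to pick a lambda: you look for the region where the coefficients stabilise before they collapse to zero.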

Hope it helps !

Keep Learning 🙂