A lot of people talk about Blender supporting other types of displacement mapping (sphere, cube, torus, etc.).
Though this has been done in Python, I think a good way for Blender to do mesh-based displacement mapping would be to use the normal (bump map) of the texture to displace each vertex in the direction it is facing (is this called a vertex normal?).
This could be done in a script, but I don't know if the bump map value can be accessed from Python. Anyone know?
Python doesn't have access to the bump map.
And technically, bump mapping changes the normal at each pixel, not the height of a vertex. One would have to find a way to go from normals back to geometry changes. Not fun.
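To see why that's awkward: a normal map effectively stores per-pixel surface slopes, so recovering height means integrating those slopes, and even then the result is only defined up to an arbitrary offset. A 1D toy sketch (plain Python, not the Blender API — the function name and inputs are made up for illustration):

```python
# Toy sketch: recover heights from per-sample slopes by integration.
# A normal map stores slopes, so height is only recoverable up to a
# constant (h0 here) by summing them up -- a crude integral.

def heights_from_slopes(slopes, step=1.0, h0=0.0):
    """Cumulative sum of slope * step, starting from an arbitrary h0."""
    hs = [h0]
    for s in slopes:
        hs.append(hs[-1] + s * step)
    return hs

# slopes of +1, +1, -2 bring the surface back to its starting height
print(heights_from_slopes([1.0, 1.0, -2.0]))  # -> [0.0, 1.0, 2.0, 0.0]
```

In 2D it's worse: the slopes from a normal map generally aren't a consistent gradient field, so a real reconstruction needs something like a least-squares (Poisson) solve rather than a simple running sum.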
Though when you use a greyscale image in Blender you do get bumps, as if it were a height map. Blender appears to convert height changes into new normals only for image textures; the procedural textures (cloud, linear, plugin, etc.) do not work this way.
Python can do displacement mapping, but it would do it separately from the material's settings.
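The basic operation such a script would perform — pushing each vertex along its own normal by a sampled height value — is simple. A minimal sketch in plain Python with a toy mesh (in Blender you'd pull the verts and normals from the mesh API and sample the image yourself; the function and data here are assumptions for illustration):

```python
# Toy sketch of height-map displacement: move each vertex along its
# (unit) vertex normal by the height sampled at that vertex.
# Plain Python with tuples, not the Blender API.

def displace(verts, normals, heights, strength=1.0):
    """Return new vertex positions: v + n * (strength * h) per vertex."""
    out = []
    for (x, y, z), (nx, ny, nz), h in zip(verts, normals, heights):
        d = strength * h
        out.append((x + nx * d, y + ny * d, z + nz * d))
    return out

# toy example: two verts of a flat plane, normals pointing up (+Z),
# heights pretend-sampled from a greyscale image
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
normals = [(0.0, 0.0, 1.0), (0.0, 0.0, 1.0)]
heights = [0.5, 0.25]
print(displace(verts, normals, heights))
# -> [(0.0, 0.0, 0.5), (1.0, 0.0, 0.25)]
```

Because it moves verts along their normals, this works the same on a sphere, cube, or torus — which is exactly the generality people are asking for — but it only knows about the image you hand it, not the material's texture settings.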