As sascha observes, constant initial weights aren't a solution in general anyway, because you need to break symmetry. A better solution for the particular context in which I came across the problem: a random number generator that gives the same sequence regardless of dtype.
import numpy as np
import tensorflow as tf

dtype = np.float64

# Random number generator that returns the requested type
# and the same sequence regardless of type: always draw in
# float64, then cast down if float32 was requested.
def rnd(shape=(), **kwargs):
    if isinstance(shape, (int, float)):
        shape = (shape,)
    x = tf.random_normal(shape, dtype=np.float64, **kwargs)
    if dtype == np.float32:
        x = tf.to_float(x)
    return x.eval()
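The same idea can be sketched without a TensorFlow session, in plain NumPy (the explicit `seed` parameter here is my own addition for illustration, not part of the snippet above): always generate in float64 so the underlying sequence is identical, then cast to the target dtype at the end.

```python
import numpy as np

def rnd(shape=(), dtype=np.float64, seed=0):
    """Draw normals in float64 so the sequence is identical for
    every requested dtype, then cast to the target dtype."""
    rng = np.random.RandomState(seed)  # fixed seed for reproducibility
    if isinstance(shape, int):
        shape = (shape,)
    x = rng.standard_normal(shape)     # always generated in float64
    return x.astype(dtype)

a = rnd(3, dtype=np.float64, seed=42)
b = rnd(3, dtype=np.float32, seed=42)
# b is a rounded copy of a: same sequence, lower precision
assert np.allclose(a, b.astype(np.float64))
```

The cast happens only at the very end, so float32 and float64 runs see the same draws up to rounding, which keeps experiments comparable across precisions.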