TensorFlow: variables inside a bijector are not being reused

Date: 2018-09-07 10:05:27

Tags: tensorflow scope

Describe the problem

I am trying to reuse the weights and biases of the neural network inside a MaskedAutoregressiveFlow bijector by building it within a tf.variable_scope with reuse=tf.AUTO_REUSE. However, I found that in practice the weights and biases are not reused.
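For reference, this is the reuse behaviour I expected, which does work when variables are created directly with tf.get_variable (a minimal sketch; the scope and variable names are only for illustration):

import tensorflow as tf

with tf.variable_scope('my_scope', reuse=tf.AUTO_REUSE):
  v_0 = tf.get_variable('v', shape=[2], dtype='float32')

with tf.variable_scope('my_scope', reuse=tf.AUTO_REUSE):
  v_1 = tf.get_variable('v', shape=[2], dtype='float32')

# The second tf.get_variable call returns the existing variable instead of
# creating a new one, so both names refer to the same object.
print(v_0 is v_1)  # True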

To reproduce

import tensorflow as tf
from tensorflow.contrib.distributions.python.ops import bijectors as tfb

def get_bijector(name='my_bijector', reuse=None):
  """Returns a MAF bijector."""
  with tf.variable_scope(name, reuse=reuse):
    shift_and_log_scale_fn = \
        tfb.masked_autoregressive_default_template([128])
    return tfb.MaskedAutoregressiveFlow(shift_and_log_scale_fn)

x = tf.placeholder(shape=[None, 64], dtype='float32', name='x')

bijector_0 = get_bijector(reuse=tf.AUTO_REUSE)
y_0 = bijector_0.forward(x)

bijector_1 = get_bijector(reuse=tf.AUTO_REUSE)
y_1 = bijector_1.forward(x)

# We expected `y_0` and `y_1` to share the same dependent variables, since we
# used `tf.AUTO_REUSE` within the `tf.variable_scope`. However, the following
# prints `False`.
print(get_dependent_variables(y_0) == get_dependent_variables(y_1))
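Listing the trainable variables after building both bijectors points in the same direction: if the reuse had worked, the second get_bijector call would not have added new variables (a quick diagnostic sketch; the exact variable names depend on the TensorFlow version):

# Diagnostic: inspect what was actually created in the default graph.
# In practice this shows a second, independently named set of weights and biases.
for var in tf.trainable_variables():
  print(var.name)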

Here, get_dependent_variables is the following helper, which collects all variables that a tensor depends on:

import collections

def get_dependent_variables(tensor):
  """Returns all variables that the tensor `tensor` depends on.

  Forked from: https://stackoverflow.com/a/42861919/1218716

  Args:
    tensor: Tensor.

  Returns:
    List of variables.
  """  
  # Initialize
  starting_op = tensor.op
  dependent_vars = []
  queue = collections.deque()
  queue.append(starting_op)
  op_to_var = {var.op: var for var in tf.trainable_variables()}
  visited = {starting_op}

  while queue:
    op = queue.popleft()
    try:
      dependent_vars.append(op_to_var[op])
    except KeyError:
      # `op` is not a variable, so search its inputs (if any). 
      for op_input in op.inputs:
        if op_input.op not in visited:
          queue.append(op_input.op)
          visited.add(op_input.op)

  return dependent_vars
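For completeness, an order-insensitive variant of the check (comparing the collected variables as sets rather than lists; tf.Variable objects are hashable in graph mode) behaves the same way:

# Order-insensitive version of the comparison in the reproduction script.
same_vars = set(get_dependent_variables(y_0)) == set(get_dependent_variables(y_1))
print(same_vars)  # Still False: the two bijectors depend on different variables.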

0 Answers:

No answers yet.