python - Reducing the shape of a Keras Embedding layer output from 3D to 2D
I have a dataset in which each example has 10 sentences (config.max_paths) and each sentence has 10 words (config.max_path_length). I am doing a word-level embedding. Here is the code:
import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Dense, Embedding, Input, Softmax, TimeDistributed
from tensorflow.keras.models import Model

def create_model(vocab_size):
    config = Config()
    path_input = Input((config.max_paths, config.max_path_length), dtype=tf.int32)
    # embedding layer: one embedding vector per word/node
    nodes_embedded = Embedding(vocab_size + 2, config.embedding_size,
                               trainable=True, name='node_embedding')(path_input)
    # path embeddings from node embeddings
    # convert nodes_embedded to path embeddings to get paths_embedded
    subtree_vectors = TimeDistributed(
        Dense(config.embedding_size, use_bias=False, activation='tanh'))(nodes_embedded)
    # attention over the transformed embeddings
    attention_vectors = Dense(1)(subtree_vectors)
    attention_weights = Softmax()(attention_vectors)
    # attention-weighted sum to build the code vector
    code_vectors = K.sum(subtree_vectors * attention_weights, axis=1)
    output_class = Dense(config.num_classes, use_bias=False, activation='softmax')(code_vectors)
    model = Model(inputs=path_input, outputs=output_class)
    return model
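For reference, a hypothetical Config that reproduces the summary below; the field values are assumptions read off the printed shapes and parameter counts, not given in the original post:

# Hypothetical Config matching the summary below: 15 paths per example,
# 10 words per path, embedding size 10, 2 output classes (values inferred
# from the printed output shapes and parameter counts).
class Config:
    max_paths = 15
    max_path_length = 10
    embedding_size = 10
    num_classes = 2

# 690 embedding params / embedding_size 10  =>  vocab_size + 2 = 69
model = create_model(vocab_size=67)
model.summary()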
The current model summary:
Layer (type)                          Output Shape          Param #   Connected to
==================================================================================================
input_6 (InputLayer)                  [(None, 15, 10)]      0         []
node_embedding (Embedding)            (None, 15, 10, 10)    690       ['input_6[0][0]']
time_distributed_5 (TimeDistributed)  (None, 15, 10, 10)    100       ['node_embedding[0][0]']
dense_14 (Dense)                      (None, 15, 10, 1)     11        ['time_distributed_5[0][0]']
softmax_4 (Softmax)                   (None, 15, 10, 1)     0         ['dense_14[0][0]']
tf.math.multiply_4 (TFOpLambda)       (None, 15, 10, 10)    0         ['time_distributed_5[0][0]', 'softmax_4[0][0]']
tf.math.reduce_sum_4 (TFOpLambda)     (None, 10, 10)        0         ['tf.math.multiply_4[0][0]']
dense_15 (Dense)                      (None, 10, 2)         20        ['tf.math.reduce_sum_4[0][0]']
I want to do some pooling or summation over the word embeddings to get sentence (path) embeddings with a reduced shape, so that the model would look something like this:
Layer (type)                          Output Shape
==================================================================================================
input_6 (InputLayer)                  [(None, 15, 10)]
node_embedding (Embedding)            (None, 15, 10, 10)
path_embedding                        (None, 15, 10)       (want to add this layer somehow)
time_distributed_5 (TimeDistributed)  (None, 15, 10)
dense_14 (Dense)                      (None, 15, 1)
softmax_4 (Softmax)                   (None, 15, 1)
tf.math.multiply_4 (TFOpLambda)       (None, 15, 10)
tf.math.reduce_sum_4 (TFOpLambda)     (None, 10)
dense_15 (Dense)                      (None, 2)
How can I do this?
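A minimal sketch of one possible approach, assuming mean pooling over the word axis is acceptable; the pooling choice and the layer name 'path_embedding' are assumptions, not part of the original code. The lines below would go inside create_model right after the Embedding layer:

from tensorflow.keras.layers import Lambda

# Collapse the word axis of the (batch, paths, words, embedding) tensor by
# averaging the word embeddings of each sentence/path, giving one vector per
# path with shape (batch, paths, embedding).
paths_embedded = Lambda(lambda t: tf.reduce_mean(t, axis=2),
                        name='path_embedding')(nodes_embedded)

# Downstream layers then consume paths_embedded instead of nodes_embedded, so
# TimeDistributed(Dense(...)) produces (batch, paths, embedding) and the
# attention-weighted sum over axis=1 yields a 2D (batch, embedding) tensor.
subtree_vectors = TimeDistributed(
    Dense(config.embedding_size, use_bias=False, activation='tanh'))(paths_embedded)

Replacing tf.reduce_mean with tf.reduce_sum gives summation instead of averaging, and TimeDistributed(GlobalAveragePooling1D()) should behave equivalently to the mean-pooling Lambda if a built-in layer is preferred.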