I'm trying to create word embeddings from an input text that contains the full corpus. I applied standard techniques and I'm using the latest version of TensorFlow, but I'm getting the following error:
TypeError: Failed to convert object of type <class 'tensorflow.python.framework.sparse_tensor.SparseTensor'> to Tensor. Contents: SparseTensor(indices=Tensor("DeserializeSparse_1:0", shape=(None, 2), dtype=int64), values=Tensor("DeserializeSparse_1:1", shape=(None,), dtype=float32), dense_shape=Tensor("stack_1:0", shape=(2,), dtype=int64)). Consider casting elements to a supported type.
The code snippet is below:
import numpy as np
from scipy import sparse
from tqdm import tqdm
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

n_words = len(unique_word_dict)
# Getting all the unique words
words = list(unique_word_dict.keys())
# Creating the X and Y matrices using one hot encoding
X = []
Y = []
for i, word_list in tqdm(enumerate(word_lists)):
    # Getting the indices
    main_word_index = unique_word_dict.get(word_list[0])
    context_word_index = unique_word_dict.get(word_list[1])
    # Creating the placeholders
    X_row = np.zeros(n_words)
    Y_row = np.zeros(n_words)
    # One hot encoding the main word
    X_row[main_word_index] = 1
    # One hot encoding the Y matrix words
    Y_row[context_word_index] = 1
    # Appending to the main matrices
    X.append(X_row)
    Y.append(Y_row)
# Converting the matrices into a sparse format because the vast majority of the data are 0s
X = sparse.csr_matrix(X)
Y = sparse.csr_matrix(Y)
# Defining the size of the embedding
embed_size = 2
# Defining the neural network
inp = Input(shape=(X.shape[1],))
x = Dense(units=embed_size, activation='linear')(inp)
x = Dense(units=Y.shape[1], activation='softmax')(x)
model = Model(inputs=inp, outputs=x)
model.compile(loss='categorical_crossentropy', optimizer='adam')
# Optimizing the network weights
model.fit(
    x=X,
    y=Y,
    batch_size=256,
    epochs=1000
)
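What I've considered so far: the error seems to be raised when the scipy CSR matrices are handed to model.fit, since Keras tries to convert them into a SparseTensor it cannot cast further. Below is a minimal sketch of the workaround I'm thinking about (not part of the original script, and assuming the X, Y and model defined above): densify the matrices before fitting. This avoids the conversion, but may be memory-heavy for a large vocabulary.

# Workaround sketch: cast the CSR matrices to dense float32 arrays
# before training. This sidesteps the SparseTensor conversion, but the
# dense one-hot matrices can use a lot of memory when n_words is large.
X_dense = X.toarray().astype('float32')
Y_dense = Y.toarray().astype('float32')

model.fit(
    x=X_dense,
    y=Y_dense,
    batch_size=256,
    epochs=1000
)

An alternative I haven't verified would be converting the CSR matrices to tf.SparseTensor objects (via .tocoo() plus tf.sparse.reorder) and keeping the data sparse, but I'm not sure whether that is what model.fit expects here, hence this question.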
Question from: https://stackoverflow.com/questions/65864681/how-to-fix-consider-casting-elements-to-a-supported-type-in-python