PositionEmbedding Dimension Error #1581
Joe-Sedlacek asked this question in Q&A (unanswered).
Hi there,
Any help with this issue is greatly appreciated. I have been following the source code for keras_nlp.layers.PositionEmbedding() (documentation here: https://keras.io/api/keras_nlp/modeling_layers/position_embedding/), but I am running into an error that I am having trouble debugging. The error reads:
TypeError: Exception encountered when calling PositionEmbedding.call().
Dimension value must be integer or None or have an index method, got value '<attribute 'shape' of 'numpy.generic' objects>' with type '<class 'getset_descriptor'>'
Arguments received by PositionEmbedding.call():
• inputs=tf.Tensor(shape=(None, 256, 128), dtype=float32)
• start_index=0
The code that triggers this error is below:
inputs = layers.Input((config.MAX_LEN,), dtype="int64")
word_embeddings = keras.layers.Embedding(
    config.VOCAB_SIZE, config.EMBED_DIM, name="word_embedding"
)(inputs)
position_embeddings = keras_nlp.layers.PositionEmbedding(
    sequence_length=config.MAX_LEN,
)(word_embeddings)
embeddings = word_embeddings + position_embeddings
config.MAX_LEN is an integer (I have checked its type), and word_embeddings has shape (None, 256, 128), so I believe it is being created correctly.
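For reference, here is a minimal self-contained sketch of the same construction with the config values inlined as plain Python ints (the values below are placeholders, not my real config; VOCAB_SIZE, EMBED_DIM, and MAX_LEN simply stand in for the corresponding config attributes):

import keras
import keras_nlp
from keras import layers

# Plain Python ints standing in for config.MAX_LEN, config.VOCAB_SIZE,
# config.EMBED_DIM (placeholder values, not my real config)
MAX_LEN = 256
VOCAB_SIZE = 30000
EMBED_DIM = 128

# Token ids in, summed token + position embeddings out
inputs = layers.Input(shape=(MAX_LEN,), dtype="int64")
word_embeddings = layers.Embedding(
    VOCAB_SIZE, EMBED_DIM, name="word_embedding"
)(inputs)
position_embeddings = keras_nlp.layers.PositionEmbedding(
    sequence_length=MAX_LEN,
)(word_embeddings)
embeddings = word_embeddings + position_embeddings

model = keras.Model(inputs, embeddings)
model.summary()

This is essentially the snippet above with everything spelled out explicitly; I include it only to rule out anything hidden in my config object.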
Again, any help is much appreciated!