I faced the same issue. The code below resolves it:
from tensorflow.keras.layers import Embedding, LSTM, Dropout, Dense
from tensorflow.keras.models import Sequential
# Ensure total_word and max_seq are defined correctly
model = Sequential()
model.add(Embedding(input_dim=total_word, output_dim=100, input_length=max_seq - 1))
model.build((None, max_seq - 1))  # build the model so the Embedding layer initializes its weights
model.add(LSTM(150, return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100))
model.add(Dense(total_word, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()
Reason: the model works after adding model.build((None, max_seq - 1)) because the Embedding layer cannot initialize its weights until the input shape is known. Calling model.build() with an explicit input shape (batch dimension None, sequence length max_seq - 1 to match input_length) triggers that weight initialization.
In Keras, layers such as Embedding can be added to a Sequential model without specifying the input shape, but their weights are only created once the input shape is defined, either implicitly by calling the model on data or explicitly via model.build(). The explicit call is particularly useful when the input shape is not known at model-definition time.
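To see that it really is build() that creates the weights, here is a minimal sketch using a standalone Embedding layer (the vocabulary size 50, embedding dimension 8, and sequence length 9 are made-up values for illustration):

```python
from tensorflow.keras.layers import Embedding

# Hypothetical sizes, for illustration only
vocab_size, embed_dim, seq_len = 50, 8, 9

emb = Embedding(input_dim=vocab_size, output_dim=embed_dim)
print(emb.built)  # False: no weights exist yet

emb.build((None, seq_len))  # defining the input shape creates the weights
print(emb.built)  # True
print(emb.embeddings.shape)  # (50, 8): one embed_dim vector per vocabulary entry
```

The same lazy-initialization rule applies to every layer in the Sequential model above, which is why a single model.build() call on the whole model is enough.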