
Date: 2024-12-12 10:52:13

Please have a look at this similar issue for reference. There are some differences in the MultiHeadAttention layer implementation for custom layers between TensorFlow 2.14 and TensorFlow 2.16.

However, you can refer to the "Image captioning with visual attention" example notebook for a MultiHeadAttention layer implementation inside a TransformerBlock.
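For context, a TransformerBlock of this kind typically wraps `keras.layers.MultiHeadAttention` with residual connections and layer normalization. Below is a minimal sketch in that style (assuming TensorFlow 2.x); the class and argument names here are illustrative, not the tutorial's exact code.

```python
import tensorflow as tf


class TransformerBlock(tf.keras.layers.Layer):
    """Illustrative transformer block: self-attention + feed-forward,
    each wrapped in a residual connection and layer normalization."""

    def __init__(self, embed_dim, num_heads, ff_dim, **kwargs):
        super().__init__(**kwargs)
        # Self-attention over the input sequence.
        self.attn = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=embed_dim)
        # Position-wise feed-forward network.
        self.ffn = tf.keras.Sequential([
            tf.keras.layers.Dense(ff_dim, activation="relu"),
            tf.keras.layers.Dense(embed_dim),
        ])
        self.norm1 = tf.keras.layers.LayerNormalization()
        self.norm2 = tf.keras.layers.LayerNormalization()

    def call(self, inputs):
        # Residual connection around attention, then around the FFN.
        attn_out = self.attn(query=inputs, value=inputs, key=inputs)
        x = self.norm1(inputs + attn_out)
        ffn_out = self.ffn(x)
        return self.norm2(x + ffn_out)


# Quick shape check: the block preserves (batch, seq_len, embed_dim).
block = TransformerBlock(embed_dim=32, num_heads=2, ff_dim=64)
out = block(tf.random.normal((1, 10, 32)))
```

When porting such a custom layer from TF 2.14 to TF 2.16, note that 2.16 defaults to Keras 3, so subclassed layers like this are one place where version differences can surface.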

Posted by: Jenny