Hi, I'm just wondering why you coded like this. #60

@seolhokim

Description

Please make sure that this is a Bug or a Feature Request and provide all applicable information asked by the template.
If your issue is an implementation question, please ask your question on StackOverflow or on the Keras Slack channel instead of opening a GitHub issue.

System information

  • Have I written custom code (as opposed to using example directory):
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04):
  • TensorFlow backend (yes / no): yes
  • TensorFlow version: 1.4
  • Keras version: 2.2
  • Python version: 3.7
  • CUDA/cuDNN version: 10/forgot
  • GPU model and memory: 16gb

You can obtain the TensorFlow version with:
python -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)"
You can obtain the Keras version with:
python -c 'import keras as k; print(k.__version__)'

Describe the current behavior
In keras/keras/layers/attention.py, line 192, the attention scores are computed by first permuting the keys and then calling batch_dot:
matmul = K.batch_dot(queries_, K.permute_dimensions(keys_, (0, 2, 1)), axes=[2, 1])

Describe the expected behavior
The explicit permutation seems unnecessary; the same result can be obtained by contracting the last axis of both tensors directly:
matmul = K.batch_dot(queries_, keys_, axes=[2, 2])
or
matmul = tf.einsum('ijk,iak->ija', queries_, keys_)
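As a quick sanity check of the claimed equivalence, here is a sketch using NumPy in place of the Keras backend (np.matmul with a transposed key tensor stands in for the permute-then-batch_dot path, and np.einsum for the proposed one; the shapes are arbitrary illustration values, not from the Keras source):

```python
import numpy as np

# Illustrative shapes: batch=2, query_len=3, key_len=4, depth=5.
rng = np.random.default_rng(0)
queries = rng.standard_normal((2, 3, 5))
keys = rng.standard_normal((2, 4, 5))

# Current code path: permute keys to (batch, depth, key_len), then batch matmul.
current = np.matmul(queries, np.transpose(keys, (0, 2, 1)))

# Proposed path: contract the shared last (depth) axis directly,
# the NumPy analogue of K.batch_dot(queries_, keys_, axes=[2, 2]).
proposed = np.einsum('ijk,iak->ija', queries, keys)

# Both yield (batch, query_len, key_len) score matrices with identical values.
assert current.shape == (2, 3, 4)
assert np.allclose(current, proposed)
```

If the two really are equivalent, the batch_dot form avoids materializing the permuted key tensor, which was the motivation for asking.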
Code to reproduce the issue
Provide a reproducible test case that is the bare minimum necessary to generate the problem.

Other info / logs
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.
