What’s happening is that you’re not actually using a Dense layer the way you might expect from a 1D vector setting (e.g., after a Flatten).

How Keras Dense really works

In Keras/TensorFlow, Dense performs a matrix multiplication between the last axis of the input and the layer’s weight matrix. It does not require you to flatten the entire input tensor, and it does not care about the other dimensions.

If the input has shape (batch, H, W, C), a Dense(units=64) layer just takes the last axis C and produces output (batch, H, W, 64).

Internally, TensorFlow applies that weight multiplication independently at every position along the other dimensions (H and W here).

That’s why you don’t get an error: your inputs have shapes like (batch, 1, T, 64), and Dense just treats (1, T) as “batch-like” dimensions that it carries along.
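A minimal sketch of that behavior (the shapes below are just illustrative, not taken from your model):

import tensorflow as tf
from tensorflow.keras import layers

x = tf.random.normal((8, 1, 32, 64))   # (batch, H, W, C)
y = layers.Dense(64)(x)                # the weight matrix acts on the last axis (C) only
print(y.shape)                         # (8, 1, 32, 64)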

Why this allows dynamic input sizes

Because the Dense operation is applied independently at every position along the non-last dimensions, it doesn’t matter whether T = 12 or T = 32. The only requirement is that the channel dimension (C) is fixed, since that’s what the weight matrix expects. The temporal dimension (T) can vary freely.

So in your example:

Input: (12, 1, 32, 64) → Dense(64) → (12, 1, 32, 64)
Input: (17, 1, 12, 64) → Dense(64) → (17, 1, 12, 64)

Both work fine because Dense is applied independently at each (batch, 1, time) location.
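As a quick sketch, you can make this explicit by leaving the time axis as None in a symbolic Input (the shapes are placeholders):

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(1, None, 64))   # time axis left unspecified
outputs = layers.Dense(64)(inputs)          # only the last axis (64) must be known
model = keras.Model(inputs, outputs)

print(model(tf.random.normal((12, 1, 32, 64))).shape)  # (12, 1, 32, 64)
print(model(tf.random.normal((17, 1, 12, 64))).shape)  # (17, 1, 12, 64)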

Contrast with pooling or flattening

If you had tried to do Flatten → Dense, then yes, you would need a fixed time dimension, because flattening collapses everything into a single vector.
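For example, here is a sketch of how that fails when the time axis is unknown (again, assumed shapes):

from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(1, None, 64))   # variable time dimension
x = layers.Flatten()(inputs)                # flattened size is unknown (None)
outputs = layers.Dense(64)(x)               # ValueError: the last dimension of the
                                            # inputs to a Dense layer should be defined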

But using Dense “in place” like this behaves more like a 1x1 Conv2D: it remaps features without collapsing spatial/temporal dimensions.
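If you want to convince yourself of that equivalence, here is a rough check that copies the Dense weights into a 1x1 Conv2D (illustration only, not production code):

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

x = tf.random.normal((2, 1, 32, 64))

dense = layers.Dense(64)
y_dense = dense(x)                          # builds the Dense kernel, shape (64, 64)

conv = layers.Conv2D(64, kernel_size=1)
conv.build(x.shape)
kernel, bias = dense.get_weights()
conv.set_weights([kernel.reshape(1, 1, 64, 64), bias])  # same weights as a 1x1 kernel
y_conv = conv(x)

print(np.allclose(y_dense.numpy(), y_conv.numpy(), atol=1e-5))  # True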

TL;DR

You’re not getting an error because Dense in Keras is defined to operate on the last axis only, broadcasting across all other axes. It’s essentially equivalent to applying a 1x1 Conv2D across the feature dimension. That’s why variable-length time dimensions are supported automatically in your setup.
