If your span lengths are equal, we can do this:

import torch  # tokens: (batch, seq_len); start_index, end_index: (batch,) index tensors

idx = torch.arange(tokens.size(1)).unsqueeze(0)                            # (1, seq_len) position grid
mask = (start_index.unsqueeze(1) <= idx) & (idx < end_index.unsqueeze(1))  # (batch, seq_len) in-span mask
spans = tokens[mask].view(tokens.size(0), -1)                              # rectangular only because all spans have the same length

Otherwise, the spans cannot be stored in a single rectangular tensor, as @dennlinger mentioned; see the sketch below for one common workaround.
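
If the span lengths differ, a common workaround (not part of the original answer) is to slice each row into a Python list of variable-length tensors, and pad them back into one rectangular tensor only if you need it. A minimal sketch with hypothetical example data, assuming tokens is (batch, seq_len) and start_index/end_index are 1-D index tensors:

import torch
from torch.nn.utils.rnn import pad_sequence

# Hypothetical data: batch of 2 sequences, spans of different lengths
tokens = torch.tensor([[10, 11, 12, 13, 14],
                       [20, 21, 22, 23, 24]])
start_index = torch.tensor([1, 0])
end_index = torch.tensor([3, 4])  # span lengths 2 and 4

# Slice each row individually, giving a list of variable-length tensors...
spans = [row[s:e] for row, s, e in zip(tokens, start_index, end_index)]
# ...and optionally pad them into one rectangular tensor
padded = pad_sequence(spans, batch_first=True, padding_value=0)
# padded -> tensor([[11, 12,  0,  0],
#                   [20, 21, 22, 23]])

Whether padding is acceptable depends on what you do downstream; if the consumer can handle a list of tensors (or a nested tensor), you can skip the pad_sequence step.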
