You can often access the tokenizer directly on the pipeline object (here `pipe`) and call it on your string to get the attention mask:
>>> pipe.tokenizer("Blah blah blah.")
{'input_ids': [101, 27984, 27984, 27984, 1012, 102], 'attention_mask': [1, 1, 1, 1, 1, 1]}
>>> pipe.tokenizer("Blah blah blah.")['attention_mask']
[1, 1, 1, 1, 1, 1]
But even if that's not an option, you already have access to the tokenizer at initialization time, so why not use it directly?
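For instance, something along these lines should work (a minimal sketch; the checkpoint name and the `sentiment-analysis` task are illustrative assumptions, so substitute whatever you actually pass to `pipeline`):

>>> from transformers import AutoTokenizer, pipeline
>>> model_name = "distilbert-base-uncased-finetuned-sst-2-english"  # assumed checkpoint, swap in yours
>>> tokenizer = AutoTokenizer.from_pretrained(model_name)
>>> pipe = pipeline("sentiment-analysis", model=model_name, tokenizer=tokenizer)
>>> # The tokenizer you built is the same object the pipeline uses,
>>> # so you can call it yourself whenever you need the attention mask:
>>> tokenizer("Blah blah blah.")["attention_mask"]
[1, 1, 1, 1, 1, 1]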