79606302

Date: 2025-05-05 04:09:39
Score: 0.5
Natty:

I have a bunch of XML files in an Azure Blob Storage container and want to copy the files to Azure Data Lake into folders based on the filename of each file. Can this be done without a ForEach loop? The files are first downloaded from an SFTP server and placed in Blob Storage to be processed by Snowflake (here I do not need a folder structure, but it is fine if a solution requires one). I also want to keep a copy of each file in the Data Lake, and here I really need some sort of structure. As the filename contains a date and time, I would like to use that date instead of the date the ADF pipeline runs.

To achieve this, I tried a workaround: I performed a single Copy activity for the same scenario you described, with Blob Storage as the source.
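As a rough sketch of what that Copy activity can look like in JSON (the activity, dataset, and parameter names here are placeholders I picked, not taken from the screenshots; the files are copied as-is using Binary datasets, and the folderPath parameter receives the date expression shown further below):

{
    "name": "CopyXmlToDataLake",
    "type": "Copy",
    "inputs": [ { "referenceName": "BlobXmlSource", "type": "DatasetReference" } ],
    "outputs": [
        {
            "referenceName": "DataLakeSink",
            "type": "DatasetReference",
            "parameters": { "folderPath": "<the date expression shown below>" }
        }
    ],
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": {
                "type": "AzureBlobStorageReadSettings",
                "recursive": true,
                "wildcardFileName": "*.xml"
            }
        },
        "sink": {
            "type": "BinarySink",
            "storeSettings": { "type": "AzureBlobFSWriteSettings" }
        }
    }
}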

The destination is Data Lake, and in the sink dataset properties I added the expression that builds the folder structure you asked for.
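One way to wire this up, as a minimal sketch (again, the dataset, linked service, container, and parameter names are placeholders): give the sink dataset a string parameter such as folderPath, reference it in the dataset's location, and then supply the date expression as that parameter's value in the Copy activity's sink settings.

{
    "name": "DataLakeSink",
    "properties": {
        "type": "Binary",
        "linkedServiceName": { "referenceName": "AzureDataLakeStorageLS", "type": "LinkedServiceReference" },
        "parameters": { "folderPath": { "type": "string" } },
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileSystem": "archive",
                "folderPath": { "value": "@dataset().folderPath", "type": "Expression" }
            }
        }
    }
}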


Use the expression below for the folder path in the dataset properties.

@{formatDateTime(pipeline().parameters.windowStart,'yyyy')}/@{formatDateTime(pipeline().parameters.windowStart,'MMMM')}/@{formatDateTime(pipeline().parameters.windowStart,'dddd')}
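For example, if windowStart is passed as 2022-06-15T01:00:00Z, this resolves to the folder path 2022/June/Wednesday ('MMMM' gives the full month name and 'dddd' the weekday name; use 'dd' instead if you want the day of the month). Note that windowStart is a pipeline parameter, so the folders reflect the value supplied at run time rather than each individual file's name.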


Output


The pipeline runs successfully.


Reasons:
  • Probably link only (1):
  • Long answer (-1):
  • Has code block (-0.5):
  • Contains question mark (0.5):
  • Low reputation (0.5):
Posted by: Shraddha Pore