I have a bunch of XML files in Azure Blob Storage and want to copy them to Azure Data Lake, into folders based on each file's name. Can this be done without a ForEach loop? The files are first downloaded from an SFTP server and placed in Blob Storage to be processed by Snowflake (no folder structure is needed there, but it is fine if a solution requires one). I also want to keep a copy of each file in the Data Lake, and here I really need some sort of structure. As each filename contains a date and time, I would like to use that date instead of the date the ADF pipeline runs.

As a workaround, I performed a single Copy activity with the same setup you described: the source is Blob Storage and the sink is the Data Lake. In the sink dataset properties I added an expression to build the folder path dynamically.

Use the expression below in the dataset's folder path property:
@{formatDateTime(pipeline().parameters.windowStart,'yyyy')}/@{formatDateTime(pipeline().parameters.windowStart,'MMMM')}/@{formatDateTime(pipeline().parameters.windowStart,'dddd')}
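Note that this expression builds the folder path from the `windowStart` pipeline parameter, not from the date embedded in each filename. To illustrate the folder layout the question asked for, here is a minimal Python sketch that parses a timestamp out of a filename and produces the same `yyyy/MMMM/dddd` path. The filename pattern (`orders_20230115_0930.xml`) and the helper name `folder_for` are assumptions for illustration, not part of the original pipeline:

```python
import re
from datetime import datetime

def folder_for(filename: str) -> str:
    """Derive a yyyy/MMMM/dddd folder path from the yyyyMMdd
    timestamp embedded in the filename (pattern is an assumption)."""
    m = re.search(r"(\d{8})", filename)
    if not m:
        raise ValueError(f"no yyyyMMdd timestamp found in {filename!r}")
    dt = datetime.strptime(m.group(1), "%Y%m%d")
    # Mirror the ADF format strings: year / full month name / full day name
    return f"{dt:%Y}/{dt:%B}/{dt:%A}"

print(folder_for("orders_20230115_0930.xml"))  # 2023/January/Sunday
```

In ADF itself, the equivalent would be slicing the filename with string functions (e.g. `substring`) inside the sink dataset's folder path expression, so no ForEach loop is required.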
Output

The pipeline runs successfully.