79397448

Date: 2025-01-29 16:29:51
Score: 0.5
Natty:

In the first script, if all the "dummy" files are identical, have the script create one file at the beginning instead of inside the foreach loop. Then replace that line with a copy of that one file into each new location the loop determines. That should save time on that one.
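A minimal sketch of the create-once, copy-many idea (the directory, file name, and count here are assumptions, not taken from your script):

$TestDir = "$($env:USERPROFILE)\TestFiles"
$Template = Join-Path $TestDir 'dummy_template.txt'

# Create the dummy file once, up front
Set-Content -Path $Template -Value ('x' * 1024)

# Inside the loop, copy the template instead of re-creating the file
1..100 | ForEach-Object {
    Copy-Item -Path $Template -Destination (Join-Path $TestDir "dummy_$_.txt")
}

Copying an existing file avoids repeating the content-generation work on every iteration.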

For the "slow script": why are you creating your own timer instead of running the script through Measure-Command?

  1. Save the script and note its full path and name
  2. Measure-Command { .\Mypath\Myscript.ps1 }

Read the amount of time Measure-Command reports.

Instead of the -Filter parameter, have you tried it like this to see whether it affects performance?

$TestDir = "$($env:USERPROFILE)\TestFiles"
# Get the list of files
Get-ChildItem -Path "$TestDir\*.txt" | ForEach-Object {
    # Simulate some processing
    Get-Content $_.FullName | Out-Null
}
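You can also use Measure-Command to compare the two approaches head to head. This is a sketch, assuming $TestDir is already set as above and contains some .txt files:

# Time the -Filter version
Measure-Command {
    Get-ChildItem -Path $TestDir -Filter '*.txt' | ForEach-Object {
        Get-Content $_.FullName | Out-Null
    }
}

# Time the wildcard -Path version
Measure-Command {
    Get-ChildItem -Path "$TestDir\*.txt" | ForEach-Object {
        Get-Content $_.FullName | Out-Null
    }
}

Compare the TotalMilliseconds values each run reports; on larger directories, -Filter is usually faster because the filtering happens in the provider rather than in PowerShell.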
Reasons:
  • Whitelisted phrase (-1): have you tried
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Ends in question mark (2):
  • Low reputation (0.5):
Posted by: Vern_Anderson