Introduction
I needed a way to run an import routine for all CSV files in a folder that had recently changed. To do this I wrote a Windows PowerShell script that looks for files with the archive attribute set. (NB: the archive attribute is set whenever a file is created or modified.)
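As a quick illustration of the attribute check on its own, here's a minimal sketch (the file path is purely an example) that tests whether a single file has the archive bit set:

$archive = [IO.FileAttributes]::Archive
# Example path only - substitute a real file
$attrs = (Get-ItemProperty -Path "C:\data\example.csv").Attributes
# -band isolates the archive bit; a non-zero result means the file has been
# created or modified since the bit was last cleared
if ($attrs -band $archive) {
    "Archive attribute is set"
}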
The basic process will be:-
- Get all files in folder (which have the .csv extension).
- Loop through looking for those with the archive attribute set.
- If found, start a new log.
- Run my dataload process.
- If successful, remove the archive bit.
- Write output to log.
The PowerShell Script
Here's what I came up with:-

$path = "<path_to_files>"
$files = Get-ChildItem -Path $path -Filter "*.csv"
$attribute = [IO.FileAttributes]::Archive
$newlog = 0

Foreach ($file in $files)
{
    # Only process files whose archive attribute is set (i.e. created or modified since the last run)
    If ((Get-ItemProperty -Path $file.FullName).Attributes -band $attribute)
    {
        # Start a new log the first time a changed file is found
        if ($newlog -eq 0)
        {
            $LogTime = Get-Date -Format "dd/MM/yyyy hh:mm:ss"
            "Processing started: $LogTime" | Out-File import.log
            $newlog = 1
        }

        "File: $file" | Out-File import.log -Append

        # Run the dataload process, capturing stdout and stderr
        $scriptOutput = &<my_external_process> $file.FullName 2>&1

        # On success, clear the archive bit so the file isn't picked up next time
        if ($?)
        {
            Set-ItemProperty -Path $file.FullName -Name Attributes -Value ((Get-ItemProperty $file.FullName).Attributes -bxor $attribute)
        }

        # Write the captured output to the log
        Foreach ($out in $scriptOutput)
        {
            "$out" | Out-File import.log -Append
        }
    }
}
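One design note on clearing the bit: the script uses -bxor, which is fine here because that line is only reached when the bit is known to be set, but -bxor would re-set the bit if it were ever clear. A slightly more defensive sketch (using the same variables as above) masks the bit off instead, which is a no-op when it's already clear:

# Clears the archive bit regardless of its current state
Set-ItemProperty -Path $file.FullName -Name Attributes -Value ((Get-ItemProperty $file.FullName).Attributes -band (-bnot [int]$attribute))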
Having spent a lot of time working with Perl, I found PowerShell fine to use. There's quite a nice Windows PowerShell ISE, which does a reasonable job of letting you develop and test in one place. I was also able to open my log file there, though it's a shame it doesn't offer to reload the file when it detects a change.
Gotchas
Just a few things that caught me out:-
- Ensure you give your script an extension of ".ps1".
- When testing, remember to use a PowerShell window, not an ordinary Command Window.
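If you do need to launch the script from an ordinary Command Window (or a scheduled task), you can hand it to powershell.exe instead. A minimal example, assuming the script has been saved as import_changed_csvs.ps1 (the name, path and -ExecutionPolicy choice are just illustrative):

powershell.exe -ExecutionPolicy Bypass -File "C:\scripts\import_changed_csvs.ps1"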