Hello. I have CSV data from a datalogger that collected a reading every second rather than every 15 minutes, so I wrote a script to export every 900th entry. The script works for smaller CSV files (up to about 80 MB), but I have one file that is 3.6 GB, and on that one it doesn't work.
I looked online and found suggestions for faster methods, but I don't have .NET available and I haven't been able to get StreamReader working.
Here is the script:
$file = Import-Csv z:\csv\input_file.csv -Header A,B,C,D,E,F
$counter = 0
ForEach ($item in $file)
{
    $counter++
    If ($counter -ge 900)
    {
        Write-Output "$item" | Out-File "z:\csv\output_file.csv" -Append
        $counter = 0
    }
}
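In case it helps anyone with the same problem, here is a streaming sketch of the same every-900th-line sampling that never loads the whole file into memory. It assumes the same input/output paths as above and PowerShell 5 or later for the `::new()` constructor syntax; since the output keeps the raw lines, it skips the Import-Csv parsing step entirely.

```powershell
# Hypothetical streaming version: read one line at a time instead of
# loading the entire 3.6 GB file with Import-Csv.
$reader = [System.IO.StreamReader]::new('z:\csv\input_file.csv')
$writer = [System.IO.StreamWriter]::new('z:\csv\output_file.csv')
$counter = 0
try {
    while ($null -ne ($line = $reader.ReadLine())) {
        $counter++
        if ($counter -ge 900) {
            # Write every 900th raw line, then reset the counter.
            $writer.WriteLine($line)
            $counter = 0
        }
    }
}
finally {
    # Always release file handles, even if the loop throws.
    $reader.Dispose()
    $writer.Dispose()
}
```

A middle-ground option, if StreamReader still won't cooperate, is `Get-Content -ReadCount`, which reads the file in chunks of lines rather than all at once.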
Any ideas or optimizations are greatly appreciated.
Thanks.