PowerShell to change sample rate of CSV file from data logger

  • Question

  • Hello. I have CSV data from a data logger that collected a reading every second rather than every 15 minutes, so I wrote a script to export every 900th entry. The script works for smaller CSV files (up to 80 MB), but it doesn't work on one file that is 3.6 GB.

    I looked online and found better methods to increase the speed, but I don't have .NET, and I haven't been able to get StreamReader to work.

    Here is the script:

    $file = Import-Csv z:\csv\input_file.csv -Header A,B,C,D,E,F
    $counter = 0
    ForEach ($item in $file)
    {
        $counter++
        If ($counter -lt 900)
        {
        }
        Else
        {
            Write-Output "$item" | Out-File "z:\csv\output_file.csv" -Append
            $counter = 0
        }
    }

    Any ideas/optimizations are greatly appreciated.

    Thanks.

    • Moved by Bill_Stewart Monday, October 2, 2017 9:59 PM This is not "fix/debug/rewrite my script for me" forum
    Tuesday, August 29, 2017 6:23 PM

All replies

  • You have .NET because PowerShell IS .NET.

    Get-Content z:\csv\input_file.csv -ReadCount 900 | %{ $_[-1] }

    Displays the last record of each block of 900 lines. It will be faster than Import-Csv and will run against any size file because it only loads the current 900 lines into memory.
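
    If you pipe that into Set-Content you get the thinned file in a single pass. A minimal sketch, assuming the same input and output paths as in your post:

    Get-Content z:\csv\input_file.csv -ReadCount 900 |
        ForEach-Object { $_[-1] } |               # keep only the last line of each 900-line block
        Set-Content z:\csv\output_file.csv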

    You can get a stream reader like this.

    $fi = Get-Item z:\csv\input_file.csv
    $s = $fi.OpenRead()                           # open the file as a read-only stream
    $streamreader = [io.streamreader]::new($s)    # wrap the stream in a StreamReader
    $streamreader.ReadLine()                      # reads one line at a time
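
    And a minimal sketch of using a StreamReader loop to keep every 900th line, assuming the same input file and the output path from your original script:

    $reader = [System.IO.StreamReader]::new('z:\csv\input_file.csv')
    $writer = [System.IO.StreamWriter]::new('z:\csv\output_file.csv')
    $counter = 0
    while (-not $reader.EndOfStream)
    {
        $line = $reader.ReadLine()
        $counter++
        If ($counter -eq 900)
        {
            $writer.WriteLine($line)   # write every 900th line
            $counter = 0
        }
    }
    $reader.Close()
    $writer.Close()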


    \_(ツ)_/




    • Edited by jrv Tuesday, August 29, 2017 6:53 PM
    Tuesday, August 29, 2017 6:50 PM