How to add creation of multiple files and a script block to this script

  • Question

  • Hi Experts,

    I have a script that collects logs from the Security & Compliance Center online, based on a configured limit (50,000 rows/entries per file).

    But we need to run it multiple times to get the entire data for a period of 10 days, because the volume of data is large and each PowerShell export run is limited to 5,000 records by default.

    Please guide me on how to loop this file export over the entire 10-day date range, creating multiple files whenever the file size or row count reaches a readable limit (e.g. 50 MB or 50,000 rows).

    In the meantime, no columns should be missed either.

    Current Script:

    =============

    $startdate = "7/14/2018 10:49 AM"
    $enddate   = "7/14/2018 12:49 PM"
    $outfile   = "C:\temp\24-07-2018-6.csv"
    $sessionid = "Audit_Report"
    #INPUT#
    $a=Search-UnifiedAuditLog -StartDate $startdate -EndDate $enddate 
    write-host "Total Records Found:" $a[0].ResultCount -ForegroundColor Yellow
    pause
    $size = 5000
    [int]$i = ($a[0].ResultCount/$size)+1
    $n = 0
    do
        {
            Search-UnifiedAuditLog -StartDate $startdate -EndDate $enddate  -Operations "MailboxLogin"  -ResultSize $size -SessionId $sessionid -SessionCommand ReturnLargeSet | Select-Object CreationDate,UserIds,AuditData | Export-Csv $outfile -Append
            $n = $n + 1
            Write-Host "++++++++++++++  $n  +++++++++++++++" -ForegroundColor Cyan
        }
    until($n -eq $i)

    ========

    This one exports a CSV file of 50,000 rows and stops.

    Requirement:

    I need to give a date range of maybe 10 days or more (right now, if I do, it takes only the first 50,000 and stops, because there are lakhs (hundreds of thousands) of entries).

    If the data crosses the limit, it needs to create a new file.

    Please help


    Regards, Manju Gowda

    • Moved by Bill_Stewart Tuesday, December 11, 2018 10:20 PM This is not "scripts on demand"
    Wednesday, July 25, 2018 12:12 PM

All replies

  • There is no 50,000-record limit on a CSV file. The file can have millions of records.


    \_(ツ)_/

    Wednesday, July 25, 2018 12:41 PM
  • Hi Jrv,

    Agreed, the limit is not from CSV; it is from the PowerShell record-pulling capability for this particular audit log.

    It does not give you more than 5,000 records per export, and as the data is huge it is again not exporting, maybe due to throttling.


    Regards, Manju Gowda

    Wednesday, July 25, 2018 12:46 PM
  • The maximum number of records allowed is 50,000 when using "ReturnLargeSet". To get more data, use a new date range and a new session ID.

    The range you are specifying cannot possibly return more than 50,000 records. It is only 2 hours.

    $startdate = "7/14/2018 10:49 AM"
    $enddate   = "7/14/2018 12:49 PM"


    \_(ツ)_/

    Wednesday, July 25, 2018 12:53 PM
  • This will attempt to get up to 50000 records.

    1..10 | 
        ForEach-Object{
            Search-UnifiedAuditLog -StartDate $startdate -EndDate $enddate -ResultSize 5000 -SessionId "Test $startdate - $enddate" -SessionCommand  ReturnLargeSet
        } |
    Export-Csv  $outfile


    \_(ツ)_/


    • Edited by jrv Wednesday, July 25, 2018 12:58 PM
    Wednesday, July 25, 2018 12:57 PM
  • Quickest thing I can think of is to have the script chunk the query into a daily breakdown, then run it through the do/while loop you have there.
    Wednesday, July 25, 2018 12:58 PM
  • Quickest thing I can think of is to have the script chunk the query into a daily breakdown, then run it through the do/while loop you have there.

    Yes.  Decide what count is less than 50000 for a date range like a day or a week and chunk it by that interval. Use even day boundaries.

    $enddate = [datetime]::Today.AddDays(-1)
    $startdate = $enddate.AddDays(-7)

    This will give an exact 7 day interval midnight to midnight.


    \_(ツ)_/



    • Edited by jrv Wednesday, July 25, 2018 1:04 PM
    Wednesday, July 25, 2018 1:01 PM
  • Here is an example of how to work with time ranges in PowerShell.

    # get 7 days one day at a time
    $enddate = [datetime]::Today.AddDays(-1)
    $startdate = $enddate.AddDays(-7)
    1..7 |
        ForEach-Object{
            1..10 | 
                ForEach-Object{
                    Search-UnifiedAuditLog -StartDate $startdate -EndDate $enddate -ResultSize 5000 -SessionId "Test $startdate - $enddate" -SessionCommand  ReturnLargeSet
                }
            # increment dates
            $startdate = $startdate.AddDays(1)
            $enddate = $enddate.AddDays(1)
        } | Export-Csv $outfile -NoType
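
    Note that, as written, each of the seven passes above queries the same sliding 7-day window rather than a single day. A sketch of a strictly one-day-per-pass variant (untested; `$windowStart`, `$windowEnd`, and the `"Day $windowStart"` session ID are illustrative names, not from this thread):

    ```powershell
    # Sketch: query one midnight-to-midnight day per pass over the last 7 days,
    # using a fresh session ID per day so each day's result set starts clean.
    $outfile = "C:\temp\Audit.csv"
    1..7 | ForEach-Object {
        $windowStart = [datetime]::Today.AddDays(-8 + $_)   # oldest day first
        $windowEnd   = $windowStart.AddDays(1)
        1..10 | ForEach-Object {
            Search-UnifiedAuditLog -StartDate $windowStart -EndDate $windowEnd -ResultSize 5000 -SessionId "Day $windowStart" -SessionCommand ReturnLargeSet
        }
    } | Export-Csv $outfile -NoTypeInformation
    ```

    A new session ID per date range matches the earlier advice: ReturnLargeSet pages within one session top out at 50,000 records.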


    \_(ツ)_/


    • Edited by jrv Wednesday, July 25, 2018 1:13 PM
    Wednesday, July 25, 2018 1:13 PM
  • Hi JRV,

    I tried running the block as shown, but it is not generating any CSV file; it just keeps running for a long time.


    Regards, Manju Gowda

    Monday, July 30, 2018 5:43 AM
  • The following script retrieves audit logs for a longer time range by breaking it into 15-minute chunks.

    To this I want to add: if the file size grows beyond a certain number of MB, it should also create a new file.

    Please help

    =====

    $logFile = "C:\MyLog.txt"
    $outputFile = "C:\AuditRecords.csv"
    [DateTime]$start = "1/22/18 03:59"
    [DateTime]$end = "1/23/18 03:59"
    $record = "SharePointFileOperation"
    $resultSize = 1000
    $intervalMinutes = 15
    $retryCount = 3

    [DateTime]$currentStart = $start
    [DateTime]$currentEnd = $start
    $currentTries = 0

    Function Write-LogFile ([String]$Message)
    {
        $final = [DateTime]::Now.ToString() + ":" + $Message
        $final | Out-File $logFile -Append
    }

    while ($true)
    {
        $currentEnd = $currentStart.AddMinutes($intervalMinutes)
        if ($currentEnd -gt $end)
        {
            break
        }
        $currentTries = 0
        $sessionID = [DateTime]::Now.ToString().Replace('/', '_')
        Write-LogFile "INFO: Retrieving audit logs between $($currentStart) and $($currentEnd)"
        $currentCount = 0
        while ($true)
        {
            [Array]$results = Search-UnifiedAuditLog -StartDate $currentStart -EndDate $currentEnd -RecordType $record -SessionId $sessionID -SessionCommand ReturnNextPreviewPage -ResultSize $resultSize
            if ($null -eq $results -or $results.Count -eq 0)
            {
                # Retry if needed. This may be due to a temporary network glitch.
                if ($currentTries -lt $retryCount)
                {
                    $currentTries = $currentTries + 1
                    continue
                }
                else
                {
                    Write-LogFile "WARNING: Empty data set returned between $($currentStart) and $($currentEnd). Retry count reached. Moving forward!"
                    break
                }
            }
            $currentTotal = $results[0].ResultCount
            if ($currentTotal -gt 5000)
            {
                Write-LogFile "WARNING: $($currentTotal) total records match the search criteria. Some records may get missed. Consider reducing the time interval!"
            }
            $currentCount = $currentCount + $results.Count
            Write-LogFile "INFO: Retrieved $($currentCount) records out of the total $($currentTotal)"
            $results | Export-Csv $outputFile -NoTypeInformation -Append
            if ($currentTotal -eq $results[$results.Count - 1].ResultIndex)
            {
                Write-LogFile "INFO: Successfully retrieved $($currentTotal) records for the current time range. Moving on!"
                break
            }
        }
        $currentStart = $currentEnd
    }
    Remove-PSSession $Session

    ====
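
    A sketch of one way to add the file-size rollover asked for above (untested; `$maxFileSizeMB`, `$script:fileIndex`, and `Get-CurrentOutputFile` are illustrative names, not part of the original script): compute the output path just before each append, and move to a new numbered file once the current one passes the threshold.

    ```powershell
    # Sketch: roll over to a new CSV once the current file exceeds a size limit.
    $maxFileSizeMB = 50
    $script:fileIndex = 1

    Function Get-CurrentOutputFile
    {
        $file = "C:\AuditRecords-$($script:fileIndex).csv"
        if ((Test-Path $file) -and (((Get-Item $file).Length / 1MB) -ge $maxFileSizeMB))
        {
            $script:fileIndex++
            $file = "C:\AuditRecords-$($script:fileIndex).csv"
        }
        return $file
    }

    # In the inner loop, replace the fixed $outputFile with a rolling one:
    #     $results | Export-Csv (Get-CurrentOutputFile) -NoTypeInformation -Append
    ```

    The same check could keep a running row counter instead of testing the file's length, if a 50,000-row cap per file is preferred over a size cap.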

    Regards, Manju Gowda

    Monday, July 30, 2018 6:00 AM
  • Sorry. I showed you how to do this, but you do not seem to understand how it works. I will let you sort it out. Your original code has issues, and we cannot provide free customization to your specifications. You will have to take the time to learn PowerShell and work it out for yourself.

    First learn how to correctly format and post code so that it is readable.


    \_(ツ)_/

    Monday, July 30, 2018 6:09 AM
  • Thanks, jrv, for the suggestion.

    Regards, Manju Gowda

    Thursday, August 2, 2018 10:25 AM