My script needs to be more flexible

  • Question

  • Good Morning and Good Afternoon,

    I am extremely new to scripting with PowerShell. I don't have any formal training in scripting other than what I have learned through the web. My boss asked me to create a script that goes through a log file and extracts data based on a pattern.

    For example, my log file looks something like this: 

    "(09:20:35.63300) SLS(1244.13): Total 0.01126 (s), Query 0.01096 (s), 1 hits for User=cryst1t,'Query=FTP,Country=CAN,Asset=EQUITY,FUTURE,FOREX,INDEX,Exact=False'
    (09:21:01.74700) SLS(1244.5): Total 0.01840 (s), Query 0.01816 (s), 1 hits for User=a'Query="ITGID"="SYD.AX",Country=AUS,Asset=EQUITY,Exact=True'
    (09:21:01.72800) SLS(1244.5): Total 0.01331 (s), Query 0.01295 (s), 1 hits for User=,'Query="ITGID"="ALQ.AX",Country=AUS,Asset=EQUITY,Exact=True'
    (09:20:45.56100) SLS(1244.11): Total 0.01039 (s), Query 0.00996 (s), 1 hits for User='Query="ITGID"="SBAC",Country=USA,Asset=,Exact=True'"

    I was able to create a script that strips out everything except the Total and Query values and computes the count, max, min, and average.

    My script looks like this:

    # Keep only the lines containing "total", strip everything except the two
    # numeric values, then parse them as CSV and compute statistics
    $a  = (Get-Content C:\Users\jmateo\TestProjects\SLS_Test.txt) | Select-String -Pattern "total"
    $a2 = ((($a -replace '.*:', '') -replace 'hits for.*', '') -replace 'total', ' ') -replace 'query', '' -replace '[(s)]', ''
    $a3 = ConvertFrom-Csv -InputObject $a2 -Delimiter "," -Header Totals, Query | Measure-Object -Property Totals, Query -Average -Maximum -Minimum -Sum
    $a3 | Format-Table -AutoSize -Expand Both -Force -Wrap

    The output looks like this, which is exactly what I want, but this script is not flexible enough to sort through another text file and do the same:

    Count            Average              Sum Maximum Minimum Property
    -----            -------              --- ------- ------- --------
     3501 0.0137331905169952 48.0799000000001 0.90181 0.00017 Totals  
     3501 0.0132653413310483         46.44196 0.90109       0 Query  
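
    To make it clearer what the chain of -replace calls is doing, here is one of the sample log lines traced through each step, with the intermediate results shown as comments (the extra spaces left behind are tolerated by ConvertFrom-Csv):

    ```powershell
    # One sample line from the log (inner single quotes doubled for PowerShell)
    $line = '(09:21:01.72800) SLS(1244.5): Total 0.01331 (s), Query 0.01295 (s), 1 hits for User=,''Query="ITGID"="ALQ.AX",Country=AUS,Asset=EQUITY,Exact=True'''

    $step1 = $line  -replace '.*:', ''        # greedy: drops everything up to the LAST colon
    $step2 = $step1 -replace 'hits for.*', '' # drops the "hits for User=..." tail
    $step3 = $step2 -replace 'total', ' '     # removes the word Total (case-insensitive)
    $step4 = $step3 -replace 'query', ''      # removes the word Query
    $step5 = $step4 -replace '[(s)]', ''      # removes the characters (, s, and )
    $step5                                    # -> "  0.01331 ,  0.01295 , 1 "
    ```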

    I went ahead and created a new script that does the following:

    Param(
        [string]$Enter_Path,
        [string]$Select_Filter
    )
    # Prompt only when a value was not supplied as a parameter
    if (-not $Enter_Path)    { $Enter_Path    = Read-Host "Please Enter the File Path" }
    if (-not $Select_Filter) { $Select_Filter = Read-Host "Please enter filter" }
    $object = Get-Content $Enter_Path | Select-String -Pattern $Select_Filter
    $object
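
    If this is saved as a script file, say SLS_Filter.ps1 (the filename is just an example), it can be run from a PowerShell prompt and will ask for its two inputs:

    ```powershell
    # Run the script; it prompts for the path and the filter
    PS C:\Users\jmateo> .\SLS_Filter.ps1
    Please Enter the File Path: C:\Users\jmateo\TestProjects\SLS_Test.txt
    Please enter filter: total
    ```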

    Here, I am asking the user to enter a path name followed by a filter. In this case, if I want to find all instances that have the word "total", I get something like this when I run it:

    "User=,'Query=""="CAG",Country=USA,Asset=,Exact=True'
    (13:07:57.01700) SLS(1244.5): Total 0.00078 (s), Query 0.00045 (s), 1 hits for User=,'Query=""="MAA",Country=USA,Asset=,Exact=True'
    (13:07:57.01900) SLS(1244.5): Total 0.00044 (s), Query 0.00024 (s), 1 hits for User=,'Query=""="SYK",Country=USA,Asset=,Exact=True'
    (13:08:04.31000) SLS(1244.5): Total 0.04163 (s), Query 0.04121 (s), 4 hits for User=,'Query="ISIN"="US6026751007" "TICKER"="MR" "SEDOL"="B1FCP24" "CUSIP"="602675100",Country=USA,Asset=EQUITY,Exact=False'"

    This works if I just want to search for a word, but my question is: how can I make this script flexible enough to grab the numerical values that come after Total and Query, like my first script does, but without all of the -replace() calls? I just want my script to read through the data, grab what I need, and filter out the rest.

    The file is a .txt; it is basically one big string. I figured out that if I could tab- or comma-delimit the file I could simply specify a header and do my computations, but the problem is that the log entries do not have equal parts. How would I be able to divide this log file into equal parts with equal headers, so that I could just select the headers with numerical values and do my computations? I have been working on this for three weeks, but I feel I do not know enough to create a more flexible script that does not rely on -replace.
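
    For reference, here is the kind of thing I am imagining: a single regex with named capture groups that pulls both numbers out of each line in one pass. This is only a sketch and I have not gotten it working against the full log; the pattern assumes every line of interest contains "Total <number> (s), Query <number> (s)".

    ```powershell
    # Sketch only: assumes each matching line has the Total/Query pattern below
    $path    = 'C:\Users\jmateo\TestProjects\SLS_Test.txt'
    $pattern = 'Total\s+(?<Total>[\d.]+)\s+\(s\),\s+Query\s+(?<Query>[\d.]+)\s+\(s\)'

    Get-Content $path |
        Select-String -Pattern $pattern |
        ForEach-Object {
            # Turn each match into an object with typed numeric properties
            [pscustomobject]@{
                Totals = [double]$_.Matches[0].Groups['Total'].Value
                Query  = [double]$_.Matches[0].Groups['Query'].Value
            }
        } |
        Measure-Object -Property Totals, Query -Average -Maximum -Minimum -Sum
    ```

    Because the values are captured by name rather than by stripping text away, the same script should work on any log whose lines match the pattern, and the path and pattern could both be passed in as parameters.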

    thanks,

    Jeff

    • Moved by Bill_Stewart Tuesday, July 21, 2015 9:08 PM This is not "scripts on demand"
    Friday, June 19, 2015 3:28 PM

Answers

All replies