Off-Topic


  • Sorry to necro a 7-year-old thread, but since it was a top result, I thought I'd add what worked for me.

    When I was trying something like this, I found that it was the encoding that was messing things up.

    Adding "-Encoding UTF8" to the end of your Out-File call makes it come up in Excel as expected. Something like:

    "COL1,COL2,COL3,COL4" | Out-File $Outfile -Encoding UTF8

    It's more of a quick-and-dirty way as opposed to the more object-oriented way described in the marked answer (a fuller sketch follows after this post).

    • Split by jrv, Friday, May 17, 2019 10:46 PM (off topic)
    Friday, May 17, 2019 8:25 PM
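
A fuller sketch of the quick-and-dirty approach above, assuming a hypothetical output path and made-up rows: in Windows PowerShell, -Encoding UTF8 writes a byte-order mark, which is what lets Excel detect the encoding.

    # Hypothetical path and rows; Out-File -Encoding UTF8 (Windows PowerShell) writes a UTF-8 BOM that Excel recognizes.
    $Outfile = 'C:\temp\report.csv'
    "COL1,COL2,COL3,COL4" | Out-File $Outfile -Encoding UTF8
    "1,2,3,4"             | Out-File $Outfile -Encoding UTF8 -Append
    "5,6,7,8"             | Out-File $Outfile -Encoding UTF8 -Append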

All replies

  • Not true. CSV files can be ASCII, UTF-8, or Unicode. All will work just fine.

    Export-Csv is the preferred method since it correctly quotes fields that need quoting (a sketch follows this reply). The encoding for this command is ASCII by default.

    You need to experiment more to find what you missed or did wrong to reach that incorrect conclusion. It will help you better understand these things.


    ¯\_(ツ)_/¯

    Friday, May 17, 2019 9:27 PM
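
A minimal sketch of the Export-Csv route described above, assuming a hypothetical output path; -Encoding UTF8 overrides the ASCII default mentioned in the reply.

    # Export-Csv quotes fields that contain commas or embedded quotes, and -Encoding overrides its ASCII default.
    Get-Process |
        Select-Object Name, Id, Path |
        Export-Csv C:\temp\proc.csv -NoTypeInformation -Encoding UTF8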
  • I have confirmed that an ANSI-encoded CSV file works better with Excel 2016 than a Unicode-encoded CSV file, exactly as people have been describing. A byte-level check follows after this reply.

    ps | select name,id | ConvertTo-Csv -NoTypeInformation > testu.csv              # '>' is Out-File: Unicode (UTF-16 LE) by default
    ps | select name,id | ConvertTo-Csv -NoTypeInformation | Set-Content testa.csv  # Set-Content writes ANSI by default

    • Edited by JS2010 Friday, May 17, 2019 10:57 PM
    Friday, May 17, 2019 10:56 PM
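
A quick byte-level check of what each command produced, assuming Windows PowerShell (where Get-Content supports -Encoding Byte): FF FE is the UTF-16 LE ("Unicode") BOM, EF BB BF is the UTF-8 BOM, and no BOM generally means ANSI.

    # Peek at the leading bytes of each file to identify its BOM.
    Get-Content testu.csv -Encoding Byte -TotalCount 2 | ForEach-Object { '{0:X2}' -f $_ }   # expect FF FE (UTF-16 LE)
    Get-Content testa.csv -Encoding Byte -TotalCount 3 | ForEach-Object { '{0:X2}' -f $_ }   # no BOM expected (ANSI)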