From a performance point of view you probably want to avoid Import-Csv/Export-Csv and go with a StreamReader/StreamWriter approach. Something like this:
$inputFolder = "C:\some\folder"
$outputFile  = "C:\path\to\output.csv"

# write the header only once, no matter how many input files are merged
$headerWritten = $false
$writer = New-Object IO.StreamWriter ($outputFile, $false)

Get-ChildItem $inputFolder -File | Where-Object {
    ...   # <-- filtering criteria for selecting input files go here
} | ForEach-Object {
    $reader = New-Object IO.StreamReader ($_.FullName)

    if (-not $headerWritten) {
        # copy header line to output file once
        $writer.WriteLine($reader.ReadLine())
        $headerWritten = $true
    } else {
        # discard header line (suppress it so it doesn't leak into the pipeline)
        $reader.ReadLine() | Out-Null
    }

    while ($reader.Peek() -ge 0) {
        $line   = $reader.ReadLine()
        $fields = $line -split ','
        if (...) {   # <-- filtering criteria for selecting output lines go here
            $writer.WriteLine($line)
        }
    }

    $reader.Close()
    $reader.Dispose()
}

$writer.Close()
$writer.Dispose()
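As a purely illustrative sketch of the two placeholders (the .csv extension check, the minimum file size, and the test on the third column are made-up examples, not requirements from your question), they might look like this:

# input file filter (body of the Where-Object scriptblock) -- hypothetical example
$_.Extension -eq '.csv' -and $_.Length -gt 0

# output line filter (condition of the if statement) -- hypothetical example:
# keep only rows whose third field equals "active"
$fields[2] -eq 'active'

One caveat on the design: splitting on a bare comma is fine for simple files, but it will break on quoted fields that contain embedded commas, so for those you'd need an actual CSV parser instead of -split.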