Yes, the problem is that with big files this can be really slow, because the whole file is read before the [index] element is returned. – CB. Feb 7 '13 at 20:10
See C.B.'s answer for performance stats (time, not memory). If you're working with large files, then this is inefficient. If you're working with just a few small files, then performance isn't that important; sometimes clean code is more important. This answer is 5 years old; things have changed in PowerShell as well. – Frode F. Oct 5 '18 at 14:24

This will show the 10th line of myfile.txt:

get-content myfile.txt | select -first 1 -skip 9

Both -first and -skip are optional parameters, and -context or -last may be useful in similar situations.
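
For instance, a similar pipeline can read a line counted from the end of the file. A minimal sketch, assuming PowerShell 3.0 or later for the -Tail parameter of Get-Content:

get-content myfile.txt -tail 10 | select -first 1

This returns the 10th line from the end of myfile.txt.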

This will work well for small files. Unless something has changed, Get-Content will read the entire file into memory. That does not always work well with large files. – lit Oct 23 '17 at 23:47

You can use the -TotalCount parameter of the Get-Content cmdlet to read the first n lines, then use Select-Object to return only the nth line:

Get-Content file.txt -TotalCount 9 | Select-Object -Last 1;

Per the comment from @C.B. this should improve performance by only reading up to and including the nth line, rather than the entire file. Note that you can use the aliases -First or -Head in place of -TotalCount.
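
If you do this often, the same pattern can be wrapped in a small helper. A minimal sketch, with Get-NthLine as a hypothetical name and a 1-based line number:

function Get-NthLine([string] $Path, [int] $LineNumber)
{
    # Read only the first $LineNumber lines, then keep the last of them
    Get-Content $Path -TotalCount $LineNumber | Select-Object -Last 1
}

Get-NthLine 'file.txt' 10    # returns the 10th line, if the file has one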


Here's a function that uses .NET's System.IO classes directly:

function GetLineAt([String] $path, [Int32] $index)
{
    [System.IO.FileMode] $mode = [System.IO.FileMode]::Open;
    [System.IO.FileAccess] $access = [System.IO.FileAccess]::Read;
    [System.IO.FileShare] $share = [System.IO.FileShare]::Read;
    [Int32] $bufferSize = 16 * 1024;
    [System.IO.FileOptions] $options = [System.IO.FileOptions]::SequentialScan;
    [System.Text.Encoding] $defaultEncoding = [System.Text.Encoding]::UTF8;
    # FileStream(String, FileMode, FileAccess, FileShare, Int32, FileOptions) constructor
    # http://msdn.microsoft.com/library/d0y914c5.aspx
    [System.IO.FileStream] $stream = New-Object `
        -TypeName 'System.IO.FileStream' `
        -ArgumentList ($path, $mode, $access, $share, $bufferSize, $options);
    # StreamReader(Stream, Encoding, Boolean, Int32) constructor
    # http://msdn.microsoft.com/library/ms143458.aspx
    [System.IO.StreamReader] $reader = New-Object `
        -TypeName 'System.IO.StreamReader' `
        -ArgumentList ($stream, $defaultEncoding, $true, $bufferSize);
    [String] $line = $null;
    [Int32] $currentIndex = 0;

    try
    {
        while (($line = $reader.ReadLine()) -ne $null)
        {
            if ($currentIndex++ -eq $index)
            {
                return $line;
            }
        }
    }
    finally
    {
        # Closing $reader also closes the underlying $stream
        $reader.Close();
    }

    # There are fewer than ($index + 1) lines in the file
    return $null;
}

GetLineAt 'file.txt' 9;

Tweaking the $bufferSize variable might affect performance. A more concise version that uses default buffer sizes and doesn't provide optimization hints could look like this:

function GetLineAt([String] $path, [Int32] $index)
{
    # StreamReader(String, Boolean) constructor
    # http://msdn.microsoft.com/library/9y86s1a9.aspx
    [System.IO.StreamReader] $reader = New-Object `
        -TypeName 'System.IO.StreamReader' `
        -ArgumentList ($path, $true);
    [String] $line = $null;
    [Int32] $currentIndex = 0;

    try
    {
        while (($line = $reader.ReadLine()) -ne $null)
        {
            if ($currentIndex++ -eq $index)
            {
                return $line;
            }
        }
    }
    finally
    {
        $reader.Close();
    }

    # There are fewer than ($index + 1) lines in the file
    return $null;
}

GetLineAt 'file.txt' 9;
        
            
Overengineering: See BACON's solution on SO for a quick way to read a text file. :) – northben Feb 8 '13 at 16:30
I stumbled on this question while looking for how to do this for a large file - exactly what I needed. – Tao Oct 22 '13 at 13:08
@Tao Thank you. Glad someone found this useful. Sometimes the built-in PowerShell cmdlets don't give you the control or efficiency you need, especially, like you said, when working with large files. – BACON Oct 22 '13 at 15:50
+1 For northben's (funny) explanation for overengineering. +1 For BACON for his effort. – prabhakaran Mar 26 '14 at 12:27
Milliseconds      : 893
Ticks             : 288932649
TotalDays         : 0,000334412788194444
TotalHours        : 0,00802590691666667
TotalMinutes      : 0,481554415
TotalSeconds      : 28,8932649
TotalMilliseconds : 28893,2649
> measure-command { (gc "c:\ps\ita\ita.txt")[260000] }
Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 9
Milliseconds      : 257
Ticks             : 92572893
TotalDays         : 0,000107144552083333
TotalHours        : 0,00257146925
TotalMinutes      : 0,154288155
TotalSeconds      : 9,2572893
TotalMilliseconds : 9257,2893
> measure-command { ([System.IO.File]::ReadAllLines("c:\ps\ita\ita.txt"))[260000] }
Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 0
Milliseconds      : 234
Ticks             : 2348059
TotalDays         : 2,71766087962963E-06
TotalHours        : 6,52238611111111E-05
TotalMinutes      : 0,00391343166666667
TotalSeconds      : 0,2348059
TotalMilliseconds : 234,8059
> measure-command {get-content .\ita\ita.txt | select -index 260000}
Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 36
Milliseconds      : 591
Ticks             : 365912596
TotalDays         : 0,000423509949074074
TotalHours        : 0,0101642387777778
TotalMinutes      : 0,609854326666667
TotalSeconds      : 36,5912596
TotalMilliseconds : 36591,2596

The winner is: ([System.IO.File]::ReadAllLines( path ))[index]
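
One caveat if you try this yourself: .NET methods such as ReadAllLines resolve relative paths against the process working directory, which is not necessarily the current PowerShell location, so resolving the path first is safer. A minimal sketch, using the same sample file as the timings above:

# Make the path absolute before handing it to the .NET API
$path = (Resolve-Path '.\ita\ita.txt').Path
([System.IO.File]::ReadAllLines($path))[260000]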

@Graimer Added :). All these tests are intended for seeking a big index in a big file; I think that for small index values the results may vary. Each test was done in a new PowerShell session to avoid HD pre-caching effects. – CB. Feb 8 '13 at 9:58
I'm really surprised that ReadAllLines() is not only faster, but so much faster than the two uses of Get-Content. As the name suggests, it's reading the entire file, too. Anyway, I posted another approach, if you want to try that one, too. Also, whenever I use Measure-Command to benchmark code I usually run it like this: 1..10 | % { Measure-Command { ... } } | Measure-Object TotalMilliseconds -Average -Min -Max -Sum; so I can get a more accurate number from multiple test runs. – BACON Feb 8 '13 at 16:22
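
To make that benchmarking pattern concrete, here is a minimal sketch that repeats the ReadAllLines test ten times and aggregates the timings; the inner command is just the winning test from above, substituted into BACON's template:

1..10 | % { Measure-Command { ([System.IO.File]::ReadAllLines("c:\ps\ita\ita.txt"))[260000] } } |
    Measure-Object TotalMilliseconds -Average -Minimum -Maximum -Sum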

To reduce memory consumption and to speed up the search, you may use the -ReadCount parameter of the Get-Content cmdlet (https://technet.microsoft.com/ru-ru/library/hh849787.aspx).

This may save hours when you're working with large files.

Here is an example:

$n = 60699010
$src = 'hugefile.csv'
$batch = 100
$timer = [Diagnostics.Stopwatch]::StartNew()
$count = 0
Get-Content $src -ReadCount $batch -TotalCount $n | % {
    # $_ is an array of up to $batch lines
    $count += $_.Length
    if ($count -ge $n) {
        # the $n'th line of the file falls inside this batch
        $_[($n - $count + $_.Length - 1)]
    }
}
$timer.Stop()
$timer.Elapsed

This prints the $n'th line and the elapsed time. With -ReadCount, each object passed down the pipeline is an array of up to $batch lines, so the index expression picks the $n'th line of the file out of the batch that contains it.
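
The same batching idea can also be written with explicit per-line bookkeeping instead of index arithmetic. A minimal sketch, not from the original answer, that should behave equivalently:

$lineNumber = 0
Get-Content $src -ReadCount $batch -TotalCount $n | ForEach-Object {
    foreach ($line in $_) {
        $lineNumber++
        if ($lineNumber -eq $n) { $line }   # emit the $n'th line when reached
    }
}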
