
Can you show me some code? How do you write this in Perl?

    f() {
      echo --
      ls /
      echo --
    }

    f > out.txt
    f | wc -l



Here is probably the simplest answer. It's not totally correct, but it's the shortest answer that fits the main criteria.

  #!/usr/bin/perl
  
  use strict;
  
  sub f
  {
    my $outputFH=shift;
  
    print $outputFH "--\n";
    open(my $lsFH, "-|", "ls /") or die("pipe ls: $!");
    print $outputFH (<$lsFH>);
    close($lsFH);
    print $outputFH "--\n";
  }
  
  open(my $outTxtFH,">","out.txt") or die("open: out.txt: $!");
  f($outTxtFH);
  close($outTxtFH);
  
  open(my $wcFH, "|-", "wc -l") or die("pipe wc: $!");
  f($wcFH);
  close($wcFH);


I'm not sure why you'd want a count that includes your delimiter lines nor why you'd want to run the binaries twice for that matter. Real programming languages, including Bash, have variables.
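For what it's worth, the bash version only needs to run the binaries once if you capture the output in a variable (a sketch; f and out.txt are from the question above):

```shell
# Run f once, keep its output in a variable, and feed both consumers
# from that variable instead of re-running the binaries.
f() {
  echo --
  ls /
  echo --
}

out=$(f)                          # run the commands exactly once
printf '%s\n' "$out" > out.txt    # consumer one: the file
printf '%s\n' "$out" | wc -l      # consumer two: the line count
```

Note that $( ) strips the trailing newline and printf '%s\n' puts exactly one back, so the count matches running `f | wc -l` directly.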

There are a number of ways to do these same things. Some of them mirror your code more closely than others. Here's my first shot using a core module, since someone already did one with no modules that works much like your code.

    use IPC::Run3;
    my @lines;

    sub f {
        my @command = qw( ls / );
        run3 \@command, \undef, \@lines;
    }

    f();
    open my $out, '>', 'out.txt' or die "can't write to out.txt: $!\n";
    printf $out "--\n%s--\n", (join '', @lines);
    print scalar @lines . "\n";


Now I'd make that a bit cleaner and more reusable of course. I'd probably take the commands to run from the command line or a configuration file. I'd probably return an array or use a reference to one rather than making a file-level lexical array and just using that from a subroutine.


I tried not to make it too golfish, but dispensed with niceties such as error detection and the like (which aren't there in the bash version either; still, autodie will catch most snafus). I also joined a few lines to make it closer to what's happening in the shell version. No doubt experienced golfers could make it tighter/shorter, but methinks that's not the point of the exercise.

    #!/usr/bin/perl -w
    use strict;
    use English;    
    use autodie;
    
    $OFS=$ORS="\n";

    sub f { my $h ; opendir($h,$_[0]) ; print "--",(readdir($h)),"--"; closedir($h);}

    my $out;

    open($out,">/tmp/out.txt") ; select $out ; f("/tmp");close($out);
    open($out,"| wc -l ")      ; select $out ; f("/tmp"); close($out);

    select STDOUT;

Would I use perl/python to write this kind of stuff? 'Course not. Why would I go through the opendir rigmarole if all I really need is 'ls'? But there are zillions of (non-application) tasks where bash's syntax gets unwieldy very quickly (think filenames with blanks, quoting quotes, composing pipes programmatically, having several filehandles open at once...) while Perl shines. And you can still throw the occasional

    @ary=split("\n",`ls`);

around if you feel so inclined.


And just to be cute, this uses 2 pipes and is shorter (but I would not write it this way).

    #!/usr/bin/perl -w
    use strict;
    use English;
    use autodie;
    
    sub f { open(my $h,"/bin/ls $_[0]|") ; print "--\n",(<$h>),"--\n";}
    
    open(my $o,">/tmp/out.txt") ; select $o ; f("/tmp") ;
    open($o,"| wc -l ")         ; select $o ; f("/tmp") ;


Assuming we have some sequence of commands whose output we want to capture, and eliminating any implicit use of the shell from Perl, I'd define a sub along the lines of

    sub output_of {
      my(@commands) = @_;

      my $pid = open(my $fh, "-|") // die "$0: fork: $!";
      return $fh if $pid;

      for (@commands) {
        my $grandchild = open(my $gfh, "-|") // die "$0: fork: $!";
        if ($grandchild) {
          print while <$gfh>;
          close $gfh or warn "$0: close: $!";
        }
        else {
          exec @$_ or die "$0: exec @$_: $!";
        }
      }

      exit 0; # child
    }

Call it with

    my $fh = output_of [qw( echo -- )],
                       [qw( ls   /  )],
                       [qw( echo -- )];

    while (<$fh>) {
      print "got: $_";
    }

    close $fh or warn "$0: close: $!";

If implicitly using the shell is acceptable but we want to interpose some processing, it will resemble

    my $output = `echo -- ; ls / ; echo --` // die "$0: command failed";
    chomp $output;

    print "$0: lines = ", `echo '$output' | wc -l`;

This becomes problematic if the output from earlier commands collides with the shell's quoting rules. This lack of "manipulexity" that we quickly bump into with shell scripts (which are otherwise great on the "whipuptitude" axis) was a common frustration before Perl. The gap between C and the shell is exactly the niche on POSIX systems that Perl occupies, and filling it was Perl's initial motivation.

If all you want to do is redirect anyway, run

    system("{ echo -- ; ls / ; echo -- ; } > out.txt") == 0
      or die "$0: command failed";

Use the appropriate tool for the job. Perl was not designed to replace the shell but to build upon it. The shell is great for small programs with linear control flow. It's hard to beat the shell for do-this-then-this processing. The real world likes to get more complex and nuanced and inconsistent, however.

Maybe I am missing your point entirely. Do you have a more concrete example in mind?


See this thread for the real problem: https://www.reddit.com/r/oilshell/comments/7tqs0a/why_create...

Sorry I got to this late -- I might do a blog post on it. I think your response, along with the 3 or 4 others I got, essentially proves my point: "Perl is not an acceptable shell".


You're not saying why any of the above don't solve your problem. Dismissing a solution because you refuse to understand it doesn't prove anything.



