- I can have a prototype running in a matter of minutes
- I have access to a vast library of modules doing everything I need (which helps the former)
- I write quick and efficient code
Do I really care about what other people think about the languages and tools I use? No.
However, I do care about efficiency and results
And, uh, working with a language that has been around for decades makes me believe my time investment is safer than with the fad-du-jour JS framework/library/whatever.
BTW, maybe buying a home in Detroit (especially now that it's officially bankrupt and will hopefully be able to break out of its union commitments) would be a good business move? That's a cheap place to bootstrap a business for sure.
The benchmarks just do not support that. I mean, Perl is roughly in parity with Python and PHP, 20x slower than Clojure on average, 30x slower than Haskell, and on many tasks 100x slower than C. Over the whole breadth of the Alioth shootout programs, Perl trades places with JRuby as the slowest tested language.
The Alioth benchmarks do a really bad job of capturing the reality of language-related performance issues. The choice of specified algorithm is going to dominate all other concerns in their benchmark, and that's not the sort of problem people face in the real world, perhaps outside of rarefied HPC circles where benchmarks on one quad-core machine aren't relevant to anything.
Performance handling in high level languages is more about knowing where the landmines are (total heap usage in GCed languages, various troublesome malloc()s, weird caches in dynamic languages...) and trying to figure out as quickly and cheaply as possible if the program you're planning on writing is going to hit them, and how much it's going to cost to mitigate it. It's not relevant if the actual runtime overhead of the problem is 5% or 1000x, the end product is either fast enough or it isn't, and the workaround is either viable or it isn't.
Trying to protect yourself from this by picking the fastest language in a shootout isn't going to work because the perfect, gotcha-free language hasn't been invented yet. There are sound performance reasons to avoid any language, not just Perl.
Most of what I do involves loading text files with fixed-width records. A file is between 500 MB and 4 GB. Then I do things with these records.
So far I have not found anything faster than perl unpack (1), but I would be happy to. I plan to investigate Go soon.
Would you have another suggestion?
1 : or marginally faster, with the advantages negated by the time it takes to write the code. However, a 10x improvement would be very interesting to me.
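For reference, a minimal sketch of the unpack approach. The field layout here (10-char name, 8-char date, 12-char amount) is invented purely for illustration; the template would be adjusted to match the actual records.

```perl
use strict;
use warnings;

# Hypothetical layout: 10-char name, 8-char date, 12-char amount.
# "A" strips trailing whitespace from each extracted field.
my $template = "A10 A8 A12";

while (my $line = <DATA>) {
    chomp $line;
    my ($name, $date, $amount) = unpack $template, $line;
    # $amount still carries leading spaces; numeric use strips them
    printf "%s | %s | %.2f\n", $name, $date, $amount;
}

__DATA__
DOE JOHN  20130718      104.50
ROE JANE  20130719     1023.00
```

Because the template is compiled once and applied per line with no regex machinery involved, this tends to be hard to beat for raw fixed-width extraction.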
I think it's important to note that his question implies the alternative's code should be as easy to read and write as Perl. One could of course write a much faster text processing tool in Assembler, but well ...
Yes that is correct. Brainfuck is not as readable as Perl. (I think you need to train more with your witty comebacks. Also drop the meme that Perl is not very readable, when it has lately improved a LOT [1] in that regard.)
But actually looking at the SOURCE of Moo instead of a toy example, I see plenty of stuff that's exactly the kind of suboptimal readability I associate with Perl: sigils all over the place, @_, the lack of proper function argument handling, etc.
Yeah, sorry man, at the point where you complain about sigils you've disqualified yourself from the discussion for lack of thinking things through far enough. Thanks for playing, bub.
Why, "bub", most sane languages don't have such cryptic, hard to read incantations. But you're obviously a zealot with a closed mind so I'm not sure why I'm even bothering to reply.
Using words like "sane" and "cryptic" and "incantations" doesn't really seem to fit an argument in good faith, but assuming you're sincere, I can explain.
Perl uses sigils for two reasons. First, to provide a separate namespace for variables and language keywords. Second, to indicate amount context, i.e. whether an expression deals with a single value or a plurality of values.
Reasonable people will disagree about the value of both of those reasons, but those are the reasons why sigils exist in Perl.
Apparently one of the mods decided I shouldn't be able to reply, hence the different account. Here's a serious answer for you:
You complained about the simple presence of sigils, without even stopping to think and consider what advantages they bring. I may be rude, but close-minded? No.
First off, in Perl the parens for a function call are not mandatory.
As such it is very useful for functions and variables to be visually distinct, especially when you're using functions to generate parameters for other functions without any temp variables in between.
my W = qq_mult ConjugateQ, qv_mult RotationQ, Vector;
You can't tell at a glance what's going on and will need to look carefully. Adding sigils makes it quite clear:
my $W = qq_mult $ConjugateQ, qv_mult $RotationQ, $Vector;
Further, most languages have only one type of variable, a name for a single thing. Perl has multiple types of variables that behave differently. For example:
my $res = munge $one;
I see this and know that the function munge is passed one single variable. However, consider this:
my $res = munge @two;
If there wasn't the @ there, it would be easy to assume that munge gets exactly one argument. However the @ there alerts us to the fact that @two is an auto-flattening array, and munge will end up with anywhere between 0 and MAX_INT arguments passed to it.
Similarly with hashes:
my $res = munge %three;
They also auto-flatten, so they need to be marked as being different from scalars, but they flatten in a very different way from arrays, in that they flatten into a list that alternates the keys and values. So they need to be marked differently from arrays as well.
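Both flattening behaviours can be seen in a tiny runnable example; munge here is just a stand-in that counts the arguments it actually receives:

```perl
use strict;
use warnings;

# munge reports how many arguments it actually received
sub munge { return scalar @_ }

my $one   = "a";
my @two   = ("b", "c", "d");
my %three = (x => 1, y => 2);

print munge($one),   "\n";   # 1: a scalar is one argument
print munge(@two),   "\n";   # 3: the array flattens into its elements
print munge(%three), "\n";   # 4: the hash flattens into alternating key/value pairs
```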
Lastly, because functions, variables, arrays and hashes have explicit sigils, they can be recognized by editors without any heavy analysis, enabling editors to mark these four types with different colors, which is extremely useful. Personally I don't see the sigils anymore, and instead just see the colors with which the types are highlighted.
(Bonus: C is often written with sigils too. I've often seen code where variables are prefixed with p_, s_, a_, i_, etc. They are not enforced by the language, but people often force themselves to use them. The downside: they are inconsistent from project to project and have to be relearned every time.)
> You can't tell at a glance what's going on and will need to look carefully. Adding sigils makes it quite clear:
> my $W = qq_mult $ConjugateQ, qv_mult $RotationQ, $Vector;
Actually, that's clear as mud. Is this a function call with 3 arguments? Function composition/chaining? An array of 3 items?
You're right. I did not explain that well enough. Even with sigils it is unclear whether qv_mult will take 1 or 2 arguments, or whether qq_mult will take 1, 2 or 3. What is clear, however, is that only qq_mult and qv_mult are function calls, which it previously was not. Worse, what I did not mention then: when reading carefully it was easy to tell that they had to be function calls; however, it was impossible to tell whether the two quaternions and the vector were variables, or function calls that did not take any arguments. Only with sigils does this become clear.
That example doesn't show it very well. To be clear: The intended advantage is: Parens are optional. That permits syntax like the following to be implemented in pure Perl without changing the parser:
try {
    die "foo";
} catch {
    warn "caught error: $_";
};
If parens were mandatory, it would need to look like this:
try(
sub {
die "foo";
},
catch(
sub {
warn "caught error: $_";
}
)
);
Having the nicer form in the first example is possible only because sigils are mandatory, along with auto-flattening variables and easy code highlighting.
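For the curious, here is roughly how such a try/catch can be built in pure Perl. This is a simplified sketch in the spirit of Try::Tiny; the real module handles many more edge cases (preserving $@, propagating caller context, finally blocks, and so on).

```perl
use strict;
use warnings;

# catch (&) lets "catch { ... }" be passed as a code ref, no "sub" needed
sub catch (&) { return $_[0] }

# try (&;@) takes a block plus optional extra args (the catch handler);
# the prototypes are what make the paren- and comma-free syntax parse
sub try (&;@) {
    my ($try, $catch) = @_;
    my @result = eval { $try->() };
    if ($@ && $catch) {
        local $_ = $@;          # the error is available as $_ in the handler
        return $catch->();
    }
    return wantarray ? @result : $result[0];
}

# The paren-free form now parses without any changes to the parser:
try {
    die "foo\n";
} catch {
    warn "caught error: $_";
};
```

Note that this only works because the prototypes are visible at compile time of the call, which is why the subs are declared before they are used.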
As for the example, with mandatory parens it would look like this:
w = qq_mult( ConjugateQ, qv_mult( RotationQ, Vector ) );
In production i'd write it like this:
my $W = qq_mult $ConjugateQ, qv_mult( $RotationQ, $Vector );
That gives a nice balance between a low number of parens, and clarity of intent.
"It is not however an effective rebuttal to the points brought up in a more general case."
Where is this general case? I clicked through the presentation and it boils down to: here are some obscure yet interesting and trendy corners of the computational world where Perl doesn't work well, therefore we have to change everything.
We could have had the same presentation 20 years ago, just the trendy weird corners would have to change to match the times. Doesn't mean those trendy corners are useless, just not required to write 'good code'.
Not that the goals are inherently awful. I don't do anything requiring UTF-8 at work. Someday I probably will, and it'll probably be a PITA.
I tend to see this kind of presentation as rabble rousing. "Here's some stuff that other languages care about, although you don't care, but it obviously doesn't prevent any of you from being profitable and gainfully employed writing good code, so it's obviously not required, which is why Perl doesn't have it. If I can convince you to care about it, Perl would gain it, or more likely you'd just leave Perl. Hope you feel uplifted, here's some internet meme pictures, k thx bye." I'm not sure other languages consider it "insightful" when they get a cut-and-paste rephrasing of the same presentation, which is possibly the most interesting part of the whole story.
Well, what does "efficient" actually mean here? I like Perl (for medium-small programs) because it's quick to write, doesn't require lots of boilerplate like Java does, and allows a much more direct translation of my intent than shell scripts do.
Personally, it sounds like you need to expand your horizons if you feel that Perl is mostly competing with Java or shell scripts as the best language for the job.
Shell scripts are for things that have almost no logic besides running other programs.
Perl is for things that do have internal logic, and aren't big enough for the lack of formal function parameters to be an issue.
Java is for things that talk to the database (or need other libraries, such as for reading/writing Excel files), or are too large to keep in my head all at once (most of which talk to the database anyway).
C is for the one program I have that needs the suid flag set, and the one wrapper that resets the process group ID. I don't like C.
The server that all these need to run on is an AIX box that doesn't have Python or OCaml or Ruby installed. If it did, I suppose it's possible I'd use one of these for some of the cases where I use Perl or Java now. Or probably not, since fewer people know them (except maybe Python?).
PL/SQL is for things that run inside the database, because it's what the database (Oracle) comes with.
C#/.NET is for things that run on my and my coworkers' laptops, because Visual Studio provides a wrapper/installer that makes it dead simple to publish updates.
There's a general idea that compiled programs are safer for the SUID flag because it's 'easier' to hijack a script (running in an interpreter) to execute arbitrary code than code that's compiled to machine language.
When I look at Alioth I really don't see this, particularly the "slowest but JRuby" claim. What is your methodology?
---
Edit:
What I do see is that the fastest test programs for the interpreted languages (ruby, python, etc.) are all trading places for the slowest language; the differences seem to be completely minimal. (As there seem to be more Perl test programs per benchmark, the slowest of them is often near the bottom. Perhaps that was your methodology.)
That kind of shows my point. (Although not as well as some of the others.)
On that particular benchmark, Perl is faster than the fastest JRuby and Python programs, and faster than 11 other benchmarks, but the secondary Perl benchmark program is the slowest of the bunch...
It's a log scale though, so it doesn't capture the magnitude of how much slower it is. When your argument is that you're 100x instead of 103x slower than C, that's not very convincing.
Not that it matters much, but I have no problem with the argument that some things should be written in C or on top of the JVM for performance reasons.
My problem is with people claiming that some popular scripting language, say Ruby, is massively faster than all the rest. Which as a general rule is simply not true (although it may be the case for some well-defined set of problems).
When flipping through them, note that Perl is consistently faster than other dynamic languages when dealing with text while also being very concise in code (generally a sign for code that can be written quickly).
I don't make money writing benchmarks. I would challenge that given a flexible enough language, they're nearly meaningless.
Another very important point is most of my Perl is glue between enormous systems. Spending any time optimizing it would be a waste of time/money. I'm not writing real time video encoding here.
I can have a prototype running in a matter of minutes
That's the case for any microframework whose language you're comfortable in.
BTW, Maybe buying a home in Detroit (especially now that it's officially bankrupt and will hopefully be able to break out of the unions commitments) would be a good business move? That's a cheap place to bootstrap a business for sure.
Not really because part of having a business is staying alive :)
In 10 years you could say exactly the same thing, except that:
- Perl 6 will STILL be in development
- You just lost 10 years of development experience and related marketability because now all those fad-du-jour systems are established industry leaders
- Nothing would have changed back in Perl 5 land because everyone can only develop in 5.8.x or 5.10.x in case they break something.
The upside of course is that there will still be COBOL and perl programmers making a living ...
"all those fad-du-jour systems are established industry leaders"
I'm trying to think of what the fads were in 2003. That was the tail end of extreme programming. Kernel03 or kernelcon or whatever it was called had a presentation about replacing devfs with udev, since absorbed into systemd. Simultaneously the new JACK system running on top of ALSA was going to take over sound on the desktop. In 2003 Moodle was the new leader of online course software. I would claim PHP4 was a very popular web programming language in 2003. Firefox was beginning its explosive growth. Bonjour/Rendezvous was new and making noise in networking. By far the most dominant handheld computing platform was the Palm, I still had my couple year old 3 and my wife had one of the M100 series. Dot-net and/or mono were going to take over the entire world; we're still waiting. I believe by this late the cuecat had already sunk along with most of the dotcom era stuff.
At this point I'm running out of ideas from 2003. I don't think "all" would be a good adjective. I'm having an easy time remembering fads that took off, and forgetting things that sank.
Nothing would have changed back in Perl 5 land because everyone can only develop in 5.8.x or 5.10.x in case they break something.
Seems doubtful to me. With yearly releases and a two year support cycle, only the enterprisey distribution nonsense is still infected with that particular brokenness.