Re: [isabelle] 64-bit Java is 6x faster than 32-bit for a recursive fibonacci



On 14-05-17 18:25, James McDonald wrote:
More generally, if you are really trying to benchmark various languages and implementations, you might want to consider a broader suite of tests. E.g., within the lisp world, you could look at Dick Gabriel’s “Performance and Evaluation of Lisp Systems” from around 1985.
(http://www.dreamsongs.com/NewFiles/Timrep.pdf)

James (and Makarius below),

Thanks for the specific optimizations. Right now I'm doing rough comparisons between languages, but all that could come in handy if I go the Lisp route. So far, the Lisps are the only languages I've seen that don't tend to get slaughtered by big integers, though Lisp may get slaughtered in other ways. I wouldn't know.

And thanks for the PDF link also. I used the TAK test out of it, since it was short and easy.
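For anyone who hasn't seen it, Gabriel's TAK benchmark is short enough to quote in full. This is a hypothetical Python sketch, not the exact code used in the timings below; the function is heavily recursive and integer-only, so it mostly stresses function-call overhead:

```python
# Gabriel's TAK benchmark (hypothetical Python sketch, not the exact
# code behind the timings in the linked repo). It recurses heavily
# with cheap integer arguments, so it exercises call overhead.
def tak(x, y, z):
    if not (y < x):
        return z
    return tak(tak(x - 1, y, z),
               tak(y - 1, z, x),
               tak(z - 1, x, y))

print(tak(18, 12, 6))  # the canonical benchmark call; prints 7
```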

I show times for Isabelle/ML, Haskell, SBCL, Clojure, Scala, and JRuby, three of them with both a 32-bit and 64-bit version. At least for the three tests, SBCL comes out ahead overall. Enough to warrant more looking into.

http://github.com/gc44/prelim/tree/master/1400/1405A_bigint_prog_lang_perf_tests


On 14-05-16 15:44, Makarius wrote:
On Fri, 16 May 2014, Gottfried Barrow wrote:
A starting point is to try and find as many languages as possible that perform well with recursive functions that use pattern matching.
You should look in the vicinity of ML and Haskell. The latter is particularly well-known to beat most other languages in the usual benchmarks.

This is where it sometimes pays to do my own tests. I've searched multiple times for comparisons between Scala and Haskell. I've read repeatedly, and seen on the benchmark sites, that Haskell is comparable to, but a little slower than, Scala.

Obviously, they're not talking about the Windows 32-bit version, because it's slow. Not a little slow, but a lot slow, relative to my own limited tests. I show that on pages at the link above.

If a recursive fibonacci function isn't relevant, then why is Isabelle/HOL filled with impractical recursive functions?
There is nothing impractical about recursion. The slowness of the naive fibonacci implementation is caused by the non-linearity of the invocation, which causes an exponential blowup.

But surely blowups are good for stress tests.
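The blowup Makarius describes is easy to make concrete. In this hypothetical Python sketch (the helper `calls` is mine, just for illustration), the naive definition invokes itself twice per step, so the number of invocations grows exponentially, roughly as fast as fib itself, even though each individual call is cheap:

```python
# Naive fibonacci: the non-linear double self-invocation is what
# causes the exponential blowup, not recursion as such.
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# Hypothetical helper: total number of fib invocations that fib(n)
# triggers, counted by the same recursion.
def calls(n):
    return 1 if n < 2 else 1 + calls(n - 1) + calls(n - 2)

print(fib(25), calls(25))  # 75025 242785 -- a quarter-million calls
```

A memoized or accumulator-style version does the same job in a linear number of calls, which is why the naive form works so well as a stress test.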

I know that recursive functions are bad performers, and recursive functions with pattern matching are even worse.
This was disproven in the 1980s or so, when the first efficient implementations of ML came about. Recursion without pattern matching was already known to be efficient much earlier, in the olden days of LISP.

I believe you, but "implementation" is the key word, and it's on my mind only because some Stack Overflow answerer was making sure everyone knows that languages aren't slow, implementations are slow. Coming from that guy, it was very irritating. Coming from me, it's profound.

I've been doing two things in multiple languages: running the if/then form of the fibonacci function, and then hunting down how to do it with pattern matching. Most of the time, pattern matching has been slower.

It was even slower for Haskell. I'm looking right now at the times I recorded in the filenames: 36s for if/then/else, and 51s for pattern matching. But the Windows 32-bit version of HaskellPlatform is slow. It's messed up, so that may not mean anything. I check these things multiple times, and want to check them again, because it's easy to get confused in all the details.

With Erlang, on the other hand, pattern matching was significantly faster. I don't think they even have a normal if/then/else statement.

I want as much speed as possible, because I think I'll need it.

I have two datatypes I want to test out, at least at some point, that point not being way off in the future. My first guess is that one of them will be slow, but I need to find out, to make some decisions.

I added two more tests, so now I'm doing three times as much testing as before, which is infinitely more than what I was doing originally, which, though not zero, was still only a small number infinitely close to zero; call it epsilon.

I thought about things a little more because you brought these things up, but there's no room for that here.

Regards,
GB





This archive was generated by a fusion of Pipermail (Mailman edition) and MHonArc.