Re: [isabelle] Large theories

Dear Stanislav,

I had a similar problem when checking some objects that came out of the Kepler conjecture proof, where I had 5 definitions of lists of (in total) 20,000 graphs. Isabelle choked on those when they were input as HOL definitions. This is what I did instead:

1. I first defined those lists in ML files so that they could be read by the PolyML parser. Then I used some magic to import those ML definitions as HOL definitions: Code_Runtime.polyml_as_definition performs that conversion. You would need to adapt this to your own situation. It worked because the types used on the ML level also existed on the HOL level: integers and lists.
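For illustration, the conversion step looks roughly like this (the binding name, type, and file path below are made up for the sketch, and the exact signature of Code_Runtime.polyml_as_definition may differ between Isabelle versions):

```
(* graphs.ML is assumed to contain a plain ML value declaration whose
   type has a direct HOL counterpart, e.g.
     val graphs = [[1, 2, 3], [4, 5, 6]] : int list list *)
setup ‹
  Code_Runtime.polyml_as_definition
    [(@{binding graphs}, @{typ "int list list"})]
    [Path.explode "graphs.ML"]
›
```

After this, `graphs` is available as a HOL constant defined to equal the ML value, without the theory file ever having to parse the huge term as HOL input.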

2. To reduce the time needed to check the objects, I wrote the checker in the executable subset of HOL. The final theorems were proved with "eval", a proof method that compiles executable propositions to ML and runs them there, i.e. reduces them to True or False. (The fact that the definitions were initially made in ML is orthogonal to the final proof by eval.)
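As a minimal sketch of that second step, with a toy checker standing in for the real one:

```
(* A checker written in the executable subset of HOL *)
definition check :: "int list ⇒ bool" where
  "check xs ⟷ sorted xs ∧ distinct xs"

(* "eval" compiles the goal to ML and runs it there *)
lemma "check [1, 2, 3]"
  by eval
```

The same pattern scales to large inputs: the checker runs at compiled-ML speed rather than by symbolic rewriting in the kernel.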


On 06/06/2016 20:44, Stanislav wrote:

I'm working on a verification of properties of a large set containing
approximately 2 million objects. Each object is relatively simple.
These objects were constructed by a computer program; now I want to prove
that the result of the program is correct.
My plan is
  1. to define 2 million objects.
  2. to prove properties of every object (that is I need 2 million lemmas).
Each lemma has at most 500 steps, and each step is relatively simple. That is, I need
approx. 500 * 2 million = 1 billion simple steps.
I'm going to generate the proofs for all those lemmas by another computer program.

Has anybody worked with specifications of this size?
Is it practical to conduct a proof of this size in Isabelle/HOL? If yes,
can you recommend how to organize the proof? If no, can you suggest other
verification tools?

My experiments show that it is impractical to define a set of 2 million
objects in Isabelle/HOL (Isabelle allocates a lot of memory and the
computation time is large).
I found that many small definitions are easier to handle than one
large definition. E.g. I was able to define 20 000 sets a1,...,a20000
of 100 elements each, then define 200 sets b1,...,b200 (each being a
union of 100 sets a_{100*i+1},...,a_{100*i+100}), and finally define a set
C as the union of b1,...,b200. But for aesthetic reasons I don't like it.
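A scaled-down sketch of that layered scheme (all names and sizes here are illustrative, not the actual 2-million-element construction):

```
(* Bottom layer: small sets of 100 elements each *)
definition a1 :: "nat set" where "a1 = set [0..<100]"
definition a2 :: "nat set" where "a2 = set [100..<200]"

(* Middle layer: unions of a fixed number of bottom-layer sets *)
definition b1 :: "nat set" where "b1 = a1 ∪ a2"

(* Top layer: the final set as a union of the middle layer *)
definition C :: "nat set" where "C = b1"
```

Each definition stays small enough for the parser and the simplifier to handle, at the cost of the bookkeeping the poster finds unaesthetic.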
Moreover, given the speed of Isabelle, I'm not sure that proofs of 2
million lemmas can be conducted in a reasonable time (say, less than two

Any ideas?


