Re: [isabelle] BUG in conv.ML: rewr_conv and then_conv fail due to beta-normalization issue

> > I consider this a bug, because it is nowhere documented that the user of 
> > conversions has to take care of beta-normalization manually.
> Despite tons of manuals, which are often hard to keep an overview of 
> anyway, the real documentation is the ML source:
> (* single rewrite step, cf. REWR_CONV in HOL *)
> fun rewr_conv rule ct =
>    let
>      val rule1 = Thm.incr_indexes (#maxidx (Thm.rep_cterm ct) + 1) rule;
>      val lhs = Thm.lhs_of rule1;
>      val rule2 = Thm.rename_boundvars (Thm.term_of lhs) (Thm.term_of ct) rule1;
>    in
>      Drule.instantiate_normalize (Thm.match (lhs, ct)) rule2
>        handle Pattern.MATCH => raise CTERM ("rewr_conv", [lhs, ct])
>    end;
> This says explicitly that your resulting rule instance will be produced 
> by Drule.instantiate_normalize, and thus be in beta-normal form.  So you 
> lose if you try to make a plain then_conv step based on something that 
> is not in beta-normal form.

So what normalizations does instantiate_normalize actually perform
(beta? eta? beta-eta? how deep?)? Looking at the source code does not
really help:
fun instantiate_normalize instpair th =
  Thm.adjust_maxidx_thm ~1 (Thm.instantiate instpair th COMP_INCR

Even looking four levels deep from here (COMP_INCR -> COMP -> compose ->
bicompose), one finds no comment mentioning normal forms, nor any
function whose name would suggest any particular kind of normalization.

> If this is good or bad, or a "bug" is a completely different question. 
> Further judgement would require careful study of how a change would 
> impact existing applications, and of potential interactions with the 
> Simplifier and the rule composition mechanisms (RS etc.).  I can't say 
> much on the spot, 
> apart from quoting the well-known "Mahabharata" software development 
> principle: "Dharma eva hato hanti / Dharmo rakshati rakshitah". In other 
> words, it might be easier to change your expectation about what the system 
> should do for you.

As Brian already mentioned: if there is an easy change that does not
break too much, and makes the system arguably more elegant, why not just
apply it?

> * Insisting too much in one way or the other causes trouble.

As my example shows, I do have to insist on one way or the other: either
use something like fix-conv, or manually insert beta_conv after every
conversion that may produce beta-redexes.
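For reference, such a normalizing wrapper can be sketched in Isabelle/ML
(a sketch against the Thm/Conv interface of that era; the name fix_conv
is hypothetical, not part of the library):

```sml
(* fix_conv: run a conversion, then beta-normalize its right-hand side,
   so that a following then_conv step sees a beta-normal term.
   Thm.beta_conversion true performs full (deep) beta-normalization. *)
fun fix_conv cv ct =
  let
    val eq = cv ct;                                       (* ct == rhs   *)
    val norm = Thm.beta_conversion true (Thm.rhs_of eq);  (* rhs == rhs' *)
  in Thm.transitive eq norm end;
```

With this, "fix_conv cv1 then_conv cv2" composes robustly even when cv1
leaves beta-redexes behind.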

> * It is possible not to insist too much.
> * What is your concrete application anyway?
Something similar to the minimal example. I need to do some
controllable rewriting in a term, replacing patterns like
"fun_upd f k (Some v)" by "op_map_update$k$v$f" if certain
side conditions on f, k, and v are met (the infixr $ is a custom
constant defined as f$x == f x). Another pattern replaced
"insert" by "\lambda x s. op_insert$x$s", which caused the beta-redex,
as I first replaced it by "\lambda x s. op_insert x s" and then wanted
to insert the $.
In my concrete case, it was easy to fix by applying beta_conv at the
right places. However, I wondered why conversions do not meet the simple
expectation that most of us seem to have (see Brian's mail). Indeed, it
took me more than an hour to track down why my code did not work: I
implicitly assumed that the standard conversions behave as expected, so
I searched in the wrong places first.
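To make the failure mode concrete, here is a sketch of the composition
that broke (Isabelle/ML; the theorem name insert_unfold and the
conversion next_conv are hypothetical stand-ins from my application, not
library names):

```sml
(* insert_unfold:  insert == (%x s. op_insert$x$s)
   Rewriting "insert a s" with insert_unfold yields the beta-redex
   "(%x s. op_insert$x$s) a s".  A second rewr_conv whose LHS is the
   beta-normal "op_insert$a$s" then fails to match: *)
val broken = Conv.rewr_conv @{thm insert_unfold} then_conv next_conv;

(* inserting an explicit beta-normalization step repairs it: *)
val fixed =
  Conv.rewr_conv @{thm insert_unfold}
  then_conv Thm.beta_conversion true
  then_conv next_conv;
```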


This archive was generated by a fusion of Pipermail (Mailman edition) and MHonArc.