Propositions | Types |
---|---|
Proof | Value / program |
Implication | Function |
Conjunction | Product |
Disjunction | Sum |
Proof verification | Type checking |
Proof construction | Programming |
Proof normalization | Program evaluation |
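A rough sketch of the correspondence in Python (typing is used only suggestively; the names here are made up):

```python
from typing import Callable, Tuple, TypeVar

A = TypeVar('A')
B = TypeVar('B')

# Implication as a function type: a 'proof' of A → B is a program turning
# evidence for A into evidence for B; modus ponens is just application.
def modus_ponens(f: Callable[[A], B], a: A) -> B:
    return f(a)

# Conjunction as a product type: a 'proof' of A ∧ B → A is a projection.
def fst(p: Tuple[A, B]) -> A:
    return p[0]
```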
From one of Philip Wadler's talks:
Logic | Types |
---|---|
Natural deduction | STLC |
(Gentzen 1935) | (Alonzo Church 1940) |
Type schemes | ML type system |
(Hindley 1969) | (Robin Milner 1975) |
System F | Polymorphic LC |
(Girard 1972) | (Reynolds 1974) |
Modal logic | Monads |
(Lewis 1910) | (Kleisli 1965, Moggi 1987) |
Classical-Intuitionistic embedding | CPS |
(Gödel 1933) | (Reynolds 1972) |
Linear logic | Session types |
(Girard 1987) | (Honda 1993) |
Untyped lambda calculus (UTLC) functions have no fixed domain and range.
Both Curry and Church came up with typed versions of their lambda calculus systems:
Curry | Church |
---|---|
Implicit types | Explicit types |
'Extrinsic typing' | 'Intrinsic typing' |
Types assigned later | Type assignment 'built-in' |
λx.x | λx:ℕ. x |
Came first | Came later |
https://ncatlab.org/nlab/show/intrinsic+and+extrinsic+views+of+typing
CoC (and hence CIC) is in Church style.
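A loose Python analogy (Python annotations are optional hints, so this only gestures at the Curry/Church distinction):

```python
# Curry style ('extrinsic'): the term stands on its own; a type can be assigned later.
identity = lambda x: x              # could later be given the type X → X

# Church style ('intrinsic'): the type annotation is written into the term itself.
def identity_nat(x: int) -> int:    # roughly λx:ℕ. x
    return x
```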
type := Type
| type → type
(→ associates to the right.)
Values of Type are type variables.
where all τᵢ are values of type type.
∪, ∩ are commutative and associative
∪ distributes over ∩, ∩ distributes over ∪
a ∪ 0 = a, a ∩ 1 = a
0 is like False / fallacy.
1 is like True / tautology.
Ref: Classical Brouwer-Heyting-Kolmogorov interpretation - Sato, Masahiko: https://repository.kulib.kyoto-u.ac.jp/dspace/bitstream/2433/61685/1/1021-3.pdf
Named after the logician Charles Peirce.
((p -> q) -> p) -> p
Truth table:
| p | q | p -> q | (p->q) -> p | ((p -> q) -> p) -> p |
|---+---+--------+-------------+----------------------|
| 0 | 0 | 1 | 0 | 1 |
| 0 | 1 | 1 | 0 | 1 |
| 1 | 0 | 0 | 1 | 1 |
| 1 | 1 | 1 | 1 | 1 |
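The same check, brute-forced in Python (the implies helper is made up):

```python
from itertools import product

implies = lambda a, b: (not a) or b

# Peirce's law holds under every assignment of p, q.
print(all(implies(implies(implies(p, q), p), p)
          for p, q in product([False, True], repeat=2)))   # True
```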
Peirce's law is equivalent to the law of excluded middle when q is made ¬p:
| ((p -> ¬p) -> p) -> p | |
| ((¬p ∨ ¬p) -> p) -> p | (p->q) ≡ (¬p ∨ q) |
| (¬p -> p) -> p | ∨ idempotence |
| (¬¬p ∨ p) -> p | (p->q) ≡ (¬p ∨ q) |
| (p ∨ p) -> p | double negation |
| p -> p | ∨ idempotence |
| ¬p ∨ p | (p->q) ≡ (¬p ∨ q) |
| p ∨ ¬p | ∨ comm |
Assumes double negation rule (ie, ¬¬p ≡ p). But it's classical logic anyway, since we are deriving law of excluded middle.
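A brute-force sanity check (plain Python lambdas, made-up names) that every line of the derivation is truth-functionally equivalent to the first:

```python
implies = lambda a, b: (not a) or b

steps = [
    lambda p: implies(implies(implies(p, not p), p), p),   # ((p -> ¬p) -> p) -> p
    lambda p: implies(implies((not p) or (not p), p), p),  # ((¬p ∨ ¬p) -> p) -> p
    lambda p: implies(implies(not p, p), p),                # (¬p -> p) -> p
    lambda p: implies((not (not p)) or p, p),               # (¬¬p ∨ p) -> p
    lambda p: implies(p or p, p),                           # (p ∨ p) -> p
    lambda p: implies(p, p),                                # p -> p
    lambda p: (not p) or p,                                 # ¬p ∨ p
    lambda p: p or (not p),                                 # p ∨ ¬p
]
print(all(s(p) == steps[0](p) for s in steps for p in (False, True)))   # True
```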
https://en.wikipedia.org/wiki/Minimal_logic
Doesn't have:
p ∨ ¬p
∀x, ∃y means x is fixed first, and y is chosen later.
∃y, ∀x means y is fixed first, and x is chosen later.
For example,
∀x:country, ∃y:city,
y = capital(x)
means 'every country has a city as its capital'
∃y:city, ∀x:country,
y = capital(x)
means 'there is some city which is the capital of all countries'.
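A toy check in Python (made-up data) contrasting the two quantifier orders:

```python
capital = {'France': 'Paris', 'Japan': 'Tokyo'}
countries, cities = capital.keys(), set(capital.values())

# ∀x:country, ∃y:city, y = capital(x): every country has some capital city.
print(all(any(y == capital[x] for y in cities) for x in countries))   # True

# ∃y:city, ∀x:country, y = capital(x): one single city is every country's capital.
print(any(all(y == capital[x] for x in countries) for y in cities))   # False
```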
A style of formal logical argumentation in which every line of the proof is a conditional tautology (known as a sequent).
Left ∧ rule (L ∧):
Γ,A∧B ⊢ Δ
=========
Γ,A,B ⊢ Δ
Left ∨ rule (L ∨):
Γ,A∨B ⊢ Δ
=================
Γ,A ⊢ Δ    Γ,B ⊢ Δ
Left → rule: (DOUBT: How does this one come about??)
Γ,A→B ⊢ Δ
=================
Γ ⊢ A,Δ    Γ,B ⊢ Δ
(Read bottom-up: to use A→B on the left we must first prove A from Γ, and once B is available we continue proving Δ.)
Left ¬ rule: (DOUBT: How does this one come about??)
Γ,¬A ⊢ Δ
========
Γ ⊢ Δ,A
If a formula A appears as a conclusion of one proof and appears as a hypothesis of another proof, a third proof can be deduced in which the formula A does not even appear (ie, A is eliminated).
Γ ⊢ A,Δ    Γ',A ⊢ Δ'
======================
Γ,Γ' ⊢ Δ,Δ'
Reference: Compiling with continuations - Andrew Appel
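A semantic sanity check of the left → rule and the cut rule, reading a sequent Γ ⊢ Δ as "every assignment satisfying all of Γ satisfies some formula in Δ" (a sketch; all names are made up):

```python
from itertools import product

def valid(gamma, delta, atoms=('a', 'b', 'c')):
    """True if every assignment satisfying all of gamma satisfies some formula in delta."""
    for values in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(g(v) for g in gamma) and not any(d(v) for d in delta):
            return False
    return True

A = lambda v: v['a']
B = lambda v: v['b']
C = lambda v: v['c']
imp = lambda p, q: (lambda v: (not p(v)) or q(v))   # p → q

# L→ (read bottom-up): Γ, A→B ⊢ Δ  iff  Γ ⊢ A,Δ  and  Γ, B ⊢ Δ.
for gamma, delta in (([A], [B]), ([], [B])):
    lhs = valid(gamma + [imp(A, B)], delta)
    rhs = valid(gamma, [A] + delta) and valid(gamma + [B], delta)
    print(lhs == rhs)                                 # True, True

# Cut: from  a, a→b ⊢ b  and  b→c, b ⊢ c  conclude  a, a→b, b→c ⊢ c.
print(valid([A, imp(A, B)], [B]),
      valid([imp(B, C), B], [C]),
      valid([A, imp(A, B), imp(B, C)], [C]))          # True True True
```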
Informally speaking,
A proposition is a 'sentence' for which 'it makes sense' to ask if it is true or false.
Reference: Applied abstract algebra (2e) - Rudolf Lidl, Günter Pilz
An example ¹⁶:
Formula: Every philosopher has written at least one book.
ie, for each philosopher, there exists a book that is written by that philosopher.
∀p, Philo(p) -> (∃b [book(b) ∧ write(p, b)])
Rewriting the implication,
∀p, ¬ Philo(p) ∨ (∃b [book(b) ∧ write(p, b)])
There exists a b for a given p means we can have a function g that maps any given p to its b.
This part is the skolemization.
∀p, ¬ Philo(p) ∨ [book(g(p)) ∧ write(p, g(p))]
Another example ¹⁶:
∀p ∀s, philo(p) ∧ studentOf(p, s) -> ∃b, book(b) ∧ write(p, b) ∧ read(s, b)
Rewriting the implication,
∀p ∀s, ¬ [philo(p) ∧ studentOf(p, s)] ∨ [∃b, book(b) ∧ write(p, b) ∧ read(s, b)]
From de-Morgan's law,
∀p ∀s, [¬philo(p) ∨ ¬studentOf(p, s)] ∨ [∃b, book(b) ∧ write(p, b) ∧ read(s, b)]
There exists a book for a given philosopher p that one of his students s read. We can have a function g mapping a given p and s to a b.
Skolemizing,
∀p ∀s, [¬philo(p) ∨ ¬studentOf(p, s)] ∨ [book(g(p,s)) ∧ write(p, g(p,s)) ∧ read(s, g(p,s))]
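A toy model in Python (made-up data, with an explicit Skolem function g) to sanity-check the skolemized formula:

```python
domain       = {'phil1', 'phil2', 'stud1'}
philosophers = {'phil1', 'phil2'}
books        = {'book1', 'book2'}
student_of   = {('phil1', 'stud1')}                     # studentOf(p, s)
writes       = {('phil1', 'book1'), ('phil2', 'book2')}
reads        = {('stud1', 'book1')}

def g(p, s):
    # Skolem function: the witness book for a given philosopher p (and student s).
    return {'phil1': 'book1', 'phil2': 'book2'}[p]

holds = all(
    (p not in philosophers) or ((p, s) not in student_of)
    or (g(p, s) in books and (p, g(p, s)) in writes and (s, g(p, s)) in reads)
    for p in domain for s in domain
)
print(holds)   # True in this model
```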
Proposed by Jean-Yves Girard.
⊸ | linear implication | aka lollipop |
⊗ | multiplicative conjunction (tensor) | ⨂ |
⅋ | multiplicative disjunction | |
⊕ | additive disjunction | ⨁ |
tm := p | ¬p
| p ⊕ p | p ⊗ p
| p & p | p ⅋ p
| 0 | 1
| ⊤ | ⊥
| !p | ?p
Nature of connectives:
Names:
Lollipop is a derived connective: p ⊸ q ≡ ¬p ⅋ q
x = y iff P(x) = P(y) for all predicates. ʷ
From 22.2 of TAPL book (page 319):
λf:X→X.
λa:X.
f (f a)
where X is a type variable.
Two abstractions. So, two arguments:
f : X → X
a : X
Final value type: X
As in
f (f a)
|  | |
|  +-+
|   |
|   X
|   |
+---+
  |
  X
So the type of the initial term is:
(X → X) → X → X
|     |   |   |
+--+--+   |   |
   |      |   |
   f      a   return value
Holding type variable abstract like this allows for parametric polymorphism.
Makes smallest commitment about the values of type variables that yields a well-typed term.
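A rough Python analogue of this term using typing.TypeVar (only an analogy; the annotations are not enforced):

```python
from typing import Callable, TypeVar

X = TypeVar('X')

# 'double': apply f twice. X is held abstract, so it works at any type.
def double(f: Callable[[X], X], a: X) -> X:
    return f(f(a))

print(double(lambda n: n + 1, 3))        # 5, with X = int
print(double(lambda s: s + '!', 'hi'))   # 'hi!!', with X = str
```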
Consider a term:
λf:Y.
λa:X.
f (f a)
This by itself can't be said to be well typed, because f is of type Y when it has to accept a : X as an argument.
But if we change Y to X → X, to make it
λf:X → X.
λa:X.
f (f a)
we could say that the term is well-typed, but without yet giving a concrete value for the type variable X. Its type is: (X → X) → X → X.
So, the latter is the 'most general instance' of the former. Smallest commitment regarding values of type variables that yields a well-typed term.
We could say that type inference (aka type reconstruction) is the search for a valid instantiation of type variables, as in:
λf:Y.
λa:X.
f (f a)
x is a fixed point of a function f if:
f(x) = x
Example:
Function | Fixed points |
---|---|
f(x) = x² | 0, 1 |
f(x) = x³ | -1, 0, 1 |
f(x) = x | ∀x |
Any function definable in lambda calculus has a fixed point, which itself is definable in lambda calculus. ⁸
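For instance, a call-by-value fixed-point combinator (the Z combinator) written with Python lambdas, used here to tie the knot for factorial (a sketch):

```python
# Z = λf.(λx.f(λv.x x v))(λx.f(λv.x x v))
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# fact is a fixed point of F: F(fact) behaves the same as fact.
F = lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1)
fact = Z(F)
print(fact(5))   # 120
```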
Informally speaking, the Church-Turing thesis says that both Turing machines and lambda calculus represent computations.
Reference: https://www.cs.cornell.edu/courses/cs3110/2021sp/textbook/adv/curry-howard.html
Related to lambda calculus.
If two different β-reductions upon a term terminate, then they terminate in the same term.
    e1
  ↗    ↘
e        e'
  ↘    ↗
    e2
References:
From here:
          B
          |
   +------+------+
   |             |
   R             B
   |             |
+--+--+          +--+
|     |             |
B     B             R
Lexical analysis | Finite state machine |
Syntax analysis | Push down automata |
Code selection | Tree automata |
Basically the state diagrams used in model checking books.
Pronounced /ˈkrɪpki/.
Apparently the same?? Not sure…
Functions which are known to terminate.
As in a loop whose maximum possible number of iterations is known before entering the loop.
(Gödel called this just 'recursive functions'. Kleene kind of renamed it.)
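A sketch of the "bounded loop" idea in Python (names made up; not a formal definition):

```python
def prim_rec(base, step):
    """h(0, y) = base(y); h(n+1, y) = step(n, h(n, y), y)."""
    def h(n, y):
        acc = base(y)
        for i in range(n):          # iteration count known before entering the loop
            acc = step(i, acc, y)
        return acc
    return h

add  = prim_rec(lambda y: y, lambda i, acc, y: acc + 1)       # add(n, y) = n + y
mult = prim_rec(lambda y: 0, lambda i, acc, y: add(acc, y))   # mult(n, y) = n * y
print(add(3, 4), mult(3, 4))   # 7 12
```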
Can model distributed systems.
Named after German computer scientist Carl Adam Petri. ʷ
A directed bipartite graph
Two kinds of nodes: places and transitions.
Arcs connect places and transitions.
A place can contain tokens (black small circles within place).
A transition is enabled if all places connected as input to it have at least one token each.
Non-deterministic behaviour.
See some example petrinets here.
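A minimal sketch of the token/firing behaviour described above (made-up places and a single transition):

```python
marking = {'p1': 1, 'p2': 1, 'p3': 0}          # tokens per place
transitions = {'t1': (['p1', 'p2'], ['p3'])}   # transition -> (input places, output places)

def enabled(t):
    ins, _ = transitions[t]
    return all(marking[p] >= 1 for p in ins)

def fire(t):
    ins, outs = transitions[t]
    assert enabled(t)
    for p in ins:
        marking[p] -= 1
    for p in outs:
        marking[p] += 1

print(enabled('t1'))   # True
fire('t1')
print(marking)         # {'p1': 0, 'p2': 0, 'p3': 1}
```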
Definition:
(P, T, F)
where:
- P is a finite set of places
- T is a finite set of transitions (with P ∩ T = ∅)
- F ⊆ (P × T) ∪ (T × P) is the flow relation (the arcs)
Variants of Petrinet:
Move registers to decrease the time for the critical path so that frequency may be increased.
Retiming is possible only when there is sufficient slack.
+--+   /-----\   /-----\   +--+   +--+
|R1|---| OP1 |->-| OP2 |->-|R2|->-|R3|
+--+   \-----/   \-----/   +--+   +--+
        10ns      20ns
where the 'R's are registers.
In the above case, R1-R2 takes 10ns + 20ns = 30ns whereas R2-R3 takes 0ns (ie, we have a positive slack of 20ns in R2-R3??).
So one clock should be of at least 30ns.
That means ν = 1/(30 * 10⁻⁹) = 33.33MHz
Yet, if OP2 were moved to R2-R3,
+--+   /-----\   +--+   /-----\   +--+
|R1|---| OP1 |->-|R2|->-| OP2 |->-|R3|
+--+   \-----/   +--+   \-----/   +--+
        10ns             20ns
one clock need be only at least 20ns.
∴ ν = 1/(20 * 10⁻⁹) = 50MHz.
ie, we can get higher frequency of operation.
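The same arithmetic as a quick Python check (delays in ns, numbers from the example above):

```python
def min_period_ns(stage_delays):
    # The clock period is set by the slowest register-to-register path.
    return max(stage_delays)

before = [10 + 20, 0]   # R1->R2 carries OP1 and OP2; R2->R3 has no logic
after  = [10, 20]       # after retiming OP2 into the R2->R3 stage
for stages in (before, after):
    period = min_period_ns(stages)
    print(period, "ns ->", round(1000 / period, 2), "MHz")   # 30 ns -> 33.33, 20 ns -> 50.0
```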
  |   |   |   |   |   |   |
5-|---|---|---|---|---|---|
  |   |   |   |   |   |   |
4-|---|---|---|---|---|---|
  |   |   |   |   |   |   |
3-|---|---|---|---▄---|---|
  |   |   |   | ▄▀|   |   |
2-|---|---|---▄▀--|---|---|
  |   |   | ▄▀|   |   |   |
1-|---|---▄▀--|---|---|---|
  |   | ▄▀|   |   |   |   |
0-|---|▀--|---|---|---|---|
  0   1   2   3   4   5   6
(1,0) to (4,3)
| | | | | | | | | | | | | |
5-|---|---|---|---|---|---| 5-|---|---|---|---|---|---|
| | | | | | | | | | | | | |
4-|---|---|---|---|---|---| 4-|---|---|---|---|---|---|
| | | | | | | | | | | | | |
3-|---┏━━━━━━━━━━━━---|---| 3-|---|---|---┏━━━━---|---|
| ┃ | | | | | | | | ┃ | | |
2-|---┃---|---|---|---|---| 2-|---|---┏━━━┛---|---|---|
| ┃ | | | | | | | ┃ | | | |
1-|---┃---|---|---|---|---| 1-|---┏━━━┛---|---|---|---|
| ┃ | | | | | | ┃ | | | | |
0-|---┃---|---|---|---|---| 0-|---┃---|---|---|---|---|
0 1 2 3 4 5 6 0 1 2 3 4 5 6
| | | | | | | | | | | | | |
5-|---|---|---|---|---|---| 5-|---|---|---|---|---|---|
| | | | | | | | | | | | | |
4-|---|---|---|---|---|---| 4-|---|---|---|---|---|---|
| | | | | | | | | | | | | |
3-|---|---|---┏━━━━---|---| 3-|---|---|---|---┃---|---|
| | | ┃ | | | | | | | ┃ | |
2-|---|---|---┃---|---|---| 2-|---|---|---|---┃---|---|
| | | ┃ | | | | | | | ┃ | |
1-|---┏━━━━━━━┛---|---|---| 1-|---|---|---|---┃---|---|
| ┃ | | | | | | | | | ┃ | |
0-|---┃---|---|---|---|---| 0-|---━━━━━━━━━━━━┛---|---|
0 1 2 3 4 5 6 0 1 2 3 4 5 6
(1,0) to (4,3)
The name comes from Manhattan, New York where the buildings were arranged in a grid layout and the shortest path a taxicab could take between two locations was precisely the 'Manhattan distance'. [¹][1]
It's like all the squares are buildings and the lines of the grid are roads.
To navigate between the different road junctions, a taxi would have to go around the 'buildings'.
Hence the nickname 'taxicab distance', I guess.
+---+---+---+---+---+---+---+---+
| 3 | 3 | 3 | 3 | 3 | 3 | 3 | 4 |
|---+---+---+---+---+---+---+---+
| 3 | 2 | 2 | 2 | 2 | 2 | 3 | 4 |
|---+---+---+---+---+---+---+---+
| 3 | 2 | 1 | 1 | 1 | 2 | 3 | 4 |
|---+---+---+---+---+---+---+---+
| 3 | 2 | 1 | 𝐾 | 1 | 2 | 3 | 4 |
|---+---+---+---+---+---+---+---+
| 3 | 2 | 1 | 1 | 1 | 2 | 3 | 4 |
|---+---+---+---+---+---+---+---+
| 3 | 2 | 2 | 2 | 2 | 2 | 3 | 4 |
|---+---+---+---+---+---+---+---+
| 3 | 3 | 3 | 3 | 3 | 3 | 3 | 4 |
|---+---+---+---+---+---+---+---+
| 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 |
+---+---+---+---+---+---+---+---+
See:
https://xlinux.nist.gov/dads/HTML/lmdistance.html
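Both distances in Python, for reference (assuming the number grid above shows the Chebyshev / L∞ distance from the cell marked 𝐾):

```python
def manhattan(p, q):
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def chebyshev(p, q):
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

print(manhattan((1, 0), (4, 3)))   # 6, the taxicab path length drawn above
print(chebyshev((1, 0), (4, 3)))   # 3
```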
https://cacm.acm.org/research/the-silent-revolution-of-sat/
–
From Handbook of Satisfiability by Armin Biere, Marijn Heule, Hans van Maaren, Toby Walsh:
In its broadest sense, a pseudo-Boolean function is a function that maps n Boolean values to a real number. … … Pseudo-Boolean constraints offer a more expressive and natural way to express constraints than clauses and yet, this formalism remains close enough to the SAT problem to benefit from the recent advances in SAT solving.
Σⱼ aⱼ·lⱼ  operator  b

where
- aⱼ, b ∈ ℤ
- lⱼ are literals
- operator ∈ {=, ≥, ≤, <, >}

RHS of the constraint (ie, b here) is known as the degree of the constraint.
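A toy example in Python (made-up coefficients) of evaluating one such constraint:

```python
# 3*x1 + 2*(¬x2) + 1*x3 >= 4, literals taking values in {0, 1}; ¬x is 1 - x.
def pb_holds(x1, x2, x3):
    return 3 * x1 + 2 * (1 - x2) + 1 * x3 >= 4   # degree (RHS) is 4

print(pb_holds(1, 0, 0))   # True:  3 + 2 + 0 = 5 >= 4
print(pb_holds(0, 0, 1))   # False: 0 + 2 + 1 = 3 <  4
```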
PB: Pseudo-boolean
QBF: Quantified boolean formula
LPB: Linear pseudo-boolean
Literal
Clause
a ∨ x        b ∨ x̅
───────────────────
       a ∨ b

where a ∨ b is the resolvent over x of a ∨ x and b ∨ x̅.
Because we need both a ∨ x and b ∨ x̅ to be true: if x is true then b must hold, and if x is false then a must hold, so a ∨ b holds either way.
3 variables => 8 possibilities:
| a | x | b | x̅ | a ∨ x | b ∨ x̅ | p = (a∨x) ∧ (b∨x̅) | q = a ∨ b | p → q |
|---+---+---+---+-------+-------+---------------------+-----------+-------|
| 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 |
| 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 1 |
| 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
| 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 |
| 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 |
| 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| 1 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 1 |
| 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 |
sage: import sage.logic.propcalc as propcalc
sage: f = propcalc.formula("((a|x) & (b|(~x))) -> (a|b)")
sage: f.truthtable()
a      x      b      value
False  False  False  True
False  False  True   True
False  True   False  True
False  True   True   True
True   False  False  True
True   False  True   True
True   True   False  True
True   True   True   True
—
Consider a CNF from Handbook of satisfiability (2e, Chapter 7):
(x ∨ y) ∧ (x ∨ y̅ ∨ z) ∧ (x̅ ∨ z) ∧ (y ∨ z̅) ∧ (y̅ ∨ z̅)
Each clause is something that must necessarily hold (ie, like an axiom).
1 | (x ∨ y) | Axiom |
2 | (x ∨ y̅ ∨ z) | Axiom |
3 | (x̅ ∨ z) | Axiom |
4 | (y ∨ z̅) | Axiom |
5 | (y̅ ∨ z̅) | Axiom |
6 | (y ∨ z) | Res(1,3) |
7 | (y̅ ∨ z) | Res(2,3) |
8 | z | Res(6,7) |
9 | z̅ | Res(4,5) |
10 | ⊥ | Res(8,9) |
Since ⊥ is derivable, this CNF is unsat.
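A brute-force confirmation in Python that no assignment satisfies all five clauses:

```python
from itertools import product

clauses = [
    lambda x, y, z: x or y,
    lambda x, y, z: x or (not y) or z,
    lambda x, y, z: (not x) or z,
    lambda x, y, z: y or (not z),
    lambda x, y, z: (not y) or (not z),
]
print(any(all(c(x, y, z) for c in clauses)
          for x, y, z in product([False, True], repeat=3)))   # False => unsat
```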
(From computerphile video on UTF-8)
ASCII
8-bit codes
Unicode:
UTF-8:
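For illustration (examples of my own, not from the video), a code point takes 1-4 bytes in UTF-8:

```python
for ch in ['A', 'é', '€', '𝄞']:
    print(ch, hex(ord(ch)), len(ch.encode('utf-8')), 'byte(s)')
# A 0x41 1, é 0xe9 2, € 0x20ac 3, 𝄞 0x1d11e 4
```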
0.0.0.0: local network
127.0.0.1: common loopback address
The 127.0.0.0 to 127.255.255.255 range (127.0.0.0/8) is reserved for loopback addresses.
::1/128: IPv6 loopback address
v4 | v6 |
---|---|
32 bits | 128 bits |
4 numbers | 8 numbers |
decimal numbers | hexadecimal numbers |
0-255 numbers | 0x0-0xffff numbers |
127.0.0.1 | ::1 |
CIDR mask | Prefix length |
Mask: 0-32 | Prefix: 0-128 |
https://stackoverflow.com/questions/1726802/what-is-the-difference-between-yaml-and-json
Super and subscript
Strikeout: Enclose with a pair of ~~
Link:
@inl: make 'inl' bold
—
See:
Sample: