16 Operators: The Complete exp-ln Census

Tier: OBSERVATION (computed) + THEOREM (completeness classification, partial)

The EML family starts with one idea: combine exp(x) and ln(y) using arithmetic. There are exactly 16 natural binary combinations. This post classifies all of them.


The 16 Operators

Each direct operator combines exp(±x) with ln(y) using one of {−, +, ×, ÷, ^}; the reversed-argument family applies the outer transcendental after combining (e.g. ln(exp(x) + y)). Removing commutative duplicates leaves exactly 16 operators.

Subtraction family (EML, EMN, DEML, DEMN):

| Operator | Formula | f(1,2) | Complete? |
|---|---|---|---|
| EML | exp(x) − ln(y) | 2.025 | YES — T02 foundation |
| EMN | ln(y) − exp(x) | −2.025 | APPROXIMATE — T24 |
| DEML | exp(−x) − ln(y) | −0.325 | NO — T13 |
| DEMN | ln(y) − exp(−x) | 0.325 | NO |
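The four formulas above can be checked directly. A minimal sketch (the lowercase function names are mine, not identifiers from the codebase):

```python
import math

# Subtraction-family operators, as defined in the table.
def eml(x, y):  return math.exp(x) - math.log(y)    # EML
def emn(x, y):  return math.log(y) - math.exp(x)    # EMN
def deml(x, y): return math.exp(-x) - math.log(y)   # DEML
def demn(x, y): return math.log(y) - math.exp(-x)   # DEMN

for name, f in [("EML", eml), ("EMN", emn), ("DEML", deml), ("DEMN", demn)]:
    print(f"{name}(1,2) = {f(1, 2):+.3f}")
```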

Addition family (EAL, DEAL):

| Operator | Formula | f(1,2) | Complete? |
|---|---|---|---|
| EAL | exp(x) + ln(y) | 3.411 | YES — add(x,y)=3n |
| DEAL | exp(−x) + ln(y) | 1.061 | NO |

Multiplication family (EXL, DEXL):

| Operator | Formula | f(1,2) | Complete? |
|---|---|---|---|
| EXL | exp(x) · ln(y) | 1.884 | YES — optimal: ln=1n, pow=3n |
| DEXL | exp(−x) · ln(y) | 0.255 | NO |

Division family (EDL, DEDL):

| Operator | Formula | f(1,2) | Complete? |
|---|---|---|---|
| EDL | exp(x) / ln(y) | 3.922 | YES — div=1n |
| DEDL | exp(−x) / ln(y) | 0.531 | NO |

Power family (EPL, DEPL):

| Operator | Formula | f(1,2) | Complete? |
|---|---|---|---|
| EPL | exp(x) ^ ln(y) | 2.000 | YES |
| DEPL | exp(−x) ^ ln(y) | 0.500 | NO |

Reversed-argument family:

| Operator | Formula | f(1,2) | Complete? |
|---|---|---|---|
| LEX | ln(exp(x) − y) | −0.331 | NO — undefined when exp(x) ≤ y |
| LEAd | ln(exp(x) + y) | 1.551 | YES — softplus = 1 node |
| ELAd | exp(x + ln(y)) | 5.437 | YES — equals y·exp(x) |
| ELSb | exp(x − ln(y)) | 1.359 | YES — equals exp(x)/y |
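All sixteen f(1,2) entries can be reproduced in a few lines. A sketch assuming only the formulas in the tables above (the dictionary keys mirror the operator names; they are not identifiers from the codebase):

```python
import math

# The 16-operator census; each lambda is the formula from the tables.
OPS = {
    "EML":  lambda x, y: math.exp(x) - math.log(y),
    "EMN":  lambda x, y: math.log(y) - math.exp(x),
    "DEML": lambda x, y: math.exp(-x) - math.log(y),
    "DEMN": lambda x, y: math.log(y) - math.exp(-x),
    "EAL":  lambda x, y: math.exp(x) + math.log(y),
    "DEAL": lambda x, y: math.exp(-x) + math.log(y),
    "EXL":  lambda x, y: math.exp(x) * math.log(y),
    "DEXL": lambda x, y: math.exp(-x) * math.log(y),
    "EDL":  lambda x, y: math.exp(x) / math.log(y),
    "DEDL": lambda x, y: math.exp(-x) / math.log(y),
    "EPL":  lambda x, y: math.exp(x) ** math.log(y),
    "DEPL": lambda x, y: math.exp(-x) ** math.log(y),
    "LEX":  lambda x, y: math.log(math.exp(x) - y),  # undefined if exp(x) <= y
    "LEAd": lambda x, y: math.log(math.exp(x) + y),
    "ELAd": lambda x, y: math.exp(x + math.log(y)),
    "ELSb": lambda x, y: math.exp(x - math.log(y)),
}

for name, f in OPS.items():
    print(f"{name}(1,2) = {f(1, 2):+.3f}")

# Algebraic identities noted in the reversed-argument table:
assert math.isclose(OPS["ELAd"](1, 2), 2 * math.exp(1))  # y * exp(x)
assert math.isclose(OPS["ELSb"](1, 2), math.exp(1) / 2)  # exp(x) / y
```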

The Structural Insight

8 complete, 1 approximate, 7 incomplete.

The pattern is clear: negating the exponent breaks completeness.

All 6 operators with exp(−x) — DEML, DEMN, DEAL, DEXL, DEDL, DEPL — are incomplete. The only incomplete operator without exp(−x) is LEX, which fails because it is undefined on a non-negligible domain.

Why does exp(−x) break completeness? The range of exp(−x) is (0,∞) — identical to exp(x). But DEML(x,y) = exp(−x) − ln(y) is bounded above by exp(−x) whenever y ≥ 1, and that bound collapses toward 0 as x grows. This prevents DEML trees from growing large, limiting their ability to represent functions with unbounded output (like ln(x) itself).
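A quick numerical illustration of the bound, under the assumption y ≥ 1 (so that ln(y) ≥ 0):

```python
import math

def deml(x, y):
    return math.exp(-x) - math.log(y)

# For y >= 1, ln(y) >= 0, so DEML(x, y) <= exp(-x),
# and the bound itself shrinks toward 0 as x grows.
for x in [0, 1, 5, 10]:
    bound = math.exp(-x)
    print(f"x={x:>2}: DEML(x, 2) = {deml(x, 2):+.4f}  (bound exp(-x) = {bound:.4f})")
    assert deml(x, 2) <= bound
```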


Softplus: The Hidden 1-Node Result

Among the reversed-argument operators, LEAd computes:

LEAd(x, y) = ln(exp(x) + y)

Setting y = 1: ln(1 + exp(x)) = softplus(x) in exactly 1 EML-family node.

Softplus is used throughout machine learning as a smooth approximation to ReLU. The fact that it costs 1 node (not 4-5 as commonly assumed) is a new result.
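The one-node claim is easy to check numerically. A sketch (the helper name `leadd` is mine):

```python
import math

def leadd(x, y):
    """LEAd(x, y) = ln(exp(x) + y), a single EML-family node."""
    return math.log(math.exp(x) + y)

# With y = 1 this coincides with softplus(x) = ln(1 + e^x).
for x in [-3.0, 0.0, 2.5]:
    assert math.isclose(leadd(x, 1), math.log1p(math.exp(x)))
```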

| Function | Expected cost | Actual EML cost |
|---|---|---|
| ReLU(x) = max(x,0) | — | ∞ (not elementary) |
| Softplus(x) = ln(1+eˣ) | ~4n | 1n via LEAd |
| Sigmoid(x) = 1/(1+e^{−x}) | ~4n | 2n (recip + DEML) |
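The "2n (recip + DEML)" row admits a concrete reading: with the constant y = 1/e, DEML(x, 1/e) = exp(−x) + 1, so one reciprocal finishes sigmoid. The choice of 1/e is my inference, not stated in the post:

```python
import math

def deml(x, y):
    return math.exp(-x) - math.log(y)

def sigmoid_via_deml(x):
    # DEML(x, 1/e) = exp(-x) - ln(1/e) = exp(-x) + 1; reciprocal gives sigmoid.
    return 1.0 / deml(x, 1.0 / math.e)

for x in [-2.0, 0.0, 3.0]:
    assert math.isclose(sigmoid_via_deml(x), 1.0 / (1.0 + math.exp(-x)))
```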

Reproduce

```shell
python python/scripts/research_new_operators.py
```

Results in python/results/new_operators_results.json.


Cite: Monogate Research (2026). “16 Operators: The Complete exp-ln Census.” monogate research blog. https://monogate.org/blog/sixteen-operators
