HiQ Cortex


Slow, on purpose.

HiQ Cortex is a thin intelligence layer above every LCA database — built to compress years of expert judgment into one conversation, without cutting any of the corners that make the answer worth trusting.

Cornu aspersum, rendered with its house on its back and its trail behind it. The animal is the argument: slow, self-documenting, legible in the evidence it leaves.

The company

§ I

HiQ-AI, in one paragraph.

HiQ-AI is a Hong Kong–registered technology company building data infrastructure and intelligence tools for the life cycle assessment industry. We maintain HiQLCD — a structured LCA emissions database covering materials, processes, and supply-chain proxies — and we build Cortex on top of it. Our two products are Cortex Chat, a conversational retrieval and reasoning interface for LCA practitioners, and Cortex Cowork, a desktop agent that manages multi-session LCA projects locally. Both products are built for independent LCA consultants and corporate sustainability teams preparing CBAM filings, EPD reports, and Scope 3 disclosures.

Why we built it

§ II

The bottleneck was never data.

The global life cycle inventory landscape holds millions of records across dozens of databases. Every practitioner with an Ecoinvent license has more data than any single project requires. The bottleneck was never volume — it was reading the question right.

Translating a real production scenario into a search query; navigating between the terminology conventions of HiQLCD, Ecoinvent, and EF in a single session; deciding whether a proxy value from one database can stand in for a missing entry in another — this work has always depended on expert judgment. It is time-consuming, expensive to scale, and nearly impossible to audit after the fact.

Cortex is our answer. A thin intelligence layer sitting above the databases, methods, and calculation tools — compressing the chain from understand the question → retrieve candidates → cross-check → produce a traceable answer into one conversation.

It does not replace experts. It scales their judgment.

The mascot

§ III

We picked the opposite of a rocket.

When we chose a mascot for Cortex, we made a choice against the category's instincts.

The default AI image is a lightning bolt, a rocket, a neural dendrite — fast, forceful, omniscient. LCA has never been that kind of discipline. A single CBAM filing can involve weeks of system-boundary debate, BOM decomposition, and proxy judgment calls across a dozen databases. Every step needs to be auditable, questionable, traceable back to its source. What clients actually need is not "answered fast" — it is "answered correctly, with a clear account of why."

A snail is the shape of that requirement. So we chose a snail.

Slow

The most visible thing about a snail is that it does not rush. Cortex inherits that: it will not skip the system-boundary discussion to produce a number sooner, and it will not collapse three competing proxy values into one without flagging the disagreement. On the backend, Cortex does the fastest things available: parallel retrieval across twelve databases, concurrent tool calls, LLM inference at scale. What it delivers is always the slowest thing: a chain of reasoning you can audit, line by line, assumption by assumption.
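The fast-backend, slow-delivery split can be sketched as concurrent retrieval followed by an explicit disagreement check. Everything below is illustrative: the database names stand in for real sources, the `search` stub replaces real API calls, the emission-factor values are fabricated for the example, and the 5% tolerance is our assumption, not Cortex's actual threshold.

```python
import asyncio

async def search(db: str, query: str) -> dict:
    """Hypothetical stand-in for one database query (placeholder for real I/O)."""
    await asyncio.sleep(0)  # simulate network latency
    # Illustrative emission factors (kg CO2e/kg) -- fabricated values
    fake = {"HiQLCD": 1.92, "Ecoinvent": 2.10, "EF": 1.95}
    return {"db": db, "query": query, "value": fake[db]}

async def retrieve(query: str) -> list[dict]:
    # The "fast backend": fire all database queries in parallel.
    dbs = ["HiQLCD", "Ecoinvent", "EF"]
    return await asyncio.gather(*(search(db, query) for db in dbs))

def flag_disagreement(candidates: list[dict], tolerance: float = 0.05) -> bool:
    # The "slow delivery": never collapse competing values silently.
    values = [c["value"] for c in candidates]
    spread = (max(values) - min(values)) / min(values)
    return spread > tolerance  # True -> surface all candidates to the user

candidates = asyncio.run(retrieve("primary aluminium ingot, global average"))
print(flag_disagreement(candidates))  # ~9% spread exceeds 5% -> True
```

The point of the sketch is the ordering: parallelism happens below the waterline, and the only thing allowed to reach the user is the flagged, inspectable disagreement.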

Slow is the most important promise we can make.

Shell

A snail carries its shelter on its back. Wherever it goes, it has its own structure with it. Cortex works the same way: every cited dataset, every proxy assumption, every version-change note travels with the answer. There is no output without provenance.

The shell is not decoration — it is the evidence that the answer can be checked.

Trail

A snail leaves a visible mark on everything it crosses. Auditors, regulators, and internal reviewers need exactly that: a path from the final number back to the original question, passable in either direction. Cortex's trail is the audit log: which databases were searched, which candidates were returned, which thresholds paused the process, and what the practitioner decided at each gate.
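A trail like that maps naturally onto an append-only log of gate decisions, serializable and reviewable in either direction. The field names, gate types, and candidate identifiers below are our assumptions for illustration, not Cortex's actual schema.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class AuditEntry:
    # One gate in the reasoning chain, traceable from the question
    # forward or from the final number backward.
    step: str                        # e.g. "retrieval", "proxy-check"
    databases_searched: list[str]
    candidates_returned: list[str]
    threshold_triggered: bool        # did this gate pause the process?
    practitioner_decision: str       # what the human decided at the gate
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

trail: list[AuditEntry] = []
trail.append(AuditEntry(
    step="retrieval",
    databases_searched=["HiQLCD", "Ecoinvent", "EF"],
    candidates_returned=["ecoinvent:alu-ingot-GLO", "hiqlcd:AL-0042"],  # illustrative IDs
    threshold_triggered=True,
    practitioner_decision="kept both candidates; flagged the spread",
))

# The trail is the deliverable: one JSON document per answer.
print(json.dumps([asdict(e) for e in trail], indent=2))
```

An append-only list of flat records is deliberate here: auditors can replay the gates in order, and nothing added later can rewrite an earlier decision.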

The trail is the deliverable.

Built to connect

§ IV

Four protocols. One API key.

Cortex exposes its full retrieval and reasoning surface through four protocols: REST for direct HTTP integration, MCP for tool-call-compatible clients, AG-UI for streaming agent interfaces, and A2A for agent-to-agent handoffs. Every capability available in Cortex Chat and Cortex Cowork is reachable through the API — the products are built on the same surface they expose. Complete reference, with examples and authentication, is at docs.x.hiq.earth.
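As one illustration of the REST surface, a retrieval call might be assembled like this. The base URL, endpoint path, parameter names, and request body are placeholders of ours; the authoritative reference, with real endpoints and authentication, is docs.x.hiq.earth. The request is built but deliberately not sent, so the sketch stays offline.

```python
import json
import urllib.request

BASE = "https://api.hiq.earth"   # assumed base URL -- placeholder
API_KEY = "YOUR_API_KEY"         # one key across all four protocols

body = {
    "query": "primary aluminium ingot, global average",
    "databases": ["HiQLCD", "Ecoinvent", "EF"],  # hypothetical cross-check set
    "trace": True,                               # hypothetical flag: include the audit trail
}
req = urllib.request.Request(
    f"{BASE}/v1/retrieve",       # assumed path, not the documented one
    data=json.dumps(body).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; omitted in this sketch.
print(req.get_method(), req.full_url)
```

The same capability would be reachable over MCP, AG-UI, or A2A with the same key; REST is shown only because it is the easiest protocol to sketch in a few lines.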

Work with us.

Get in touch for questions about Cortex, API access, integration partnerships, or anything else.