Want Housing at Scale? Stop Treating Land Data Like Insider Information

Trying to understand land ownership in Ontario feels less like research and more like starring in a low-budget CBC crime drama (only on CBC Gem). I moved from Montréal expecting open land data and instead got paywalls, outdated systems, and enough opacity to make Bay Street proud.

Now zoom out. Because this isn’t just a data problem. It’s a scale problem.

Canada’s housing crisis isn’t a niche market inefficiency you can patch with a few well-capitalized developers and a handful of rezonings that, in practice, aren’t happening (see Calgary and Toronto). It’s a structural supply failure measured in millions of homes: CMHC puts the gap at 3.5 million additional units by 2030. And yet we continue to operate as if a small club of corporate developers, working parcel by parcel, variance by variance, can close it. They can’t.

Even if every major developer in Ontario doubled output tomorrow, you’d still be nowhere near the level of production required to restore affordability. The math simply doesn’t math. A constrained system cannot produce unconstrained supply. And this is where Ontario’s opacity becomes more than just frustrating. It becomes economically dangerous.

Because when land data is hidden, outdated, or prohibitively difficult to access, you don’t just slow down researchers (see Housing Assessment Resource Tool). You throttle the entire ecosystem of potential housing providers: small builders, co-operatives, non-profits, community land trusts, and even informed homeowners. You effectively gatekeep participation in the land market to those with the capital, time, and insider knowledge to navigate it.

Meanwhile, in Montréal, you can sit down with an overpriced coffee in the Plateau, open the municipal assessment rolls, and within minutes understand ownership, valuation, taxation, and transaction history with a level of clarity that feels… suspiciously like good governance. It’s not perfect, but it’s functional. It treats land information as public infrastructure rather than proprietary intelligence, a system built, in part, on historic fears of European foreign ownership. Ontario, by contrast, treats land data like nuclear launch codes. The system, run by the Municipal Property Assessment Corporation (MPAC), operates with all the transparency of a black box wrapped in an NDA.

And the results? Predictably distorted. A Toronto Star investigation found that 69% of the lowest-value homes were overassessed, while 47% of the highest-value homes were underassessed. On top of that, Ontario municipalities (read: taxpayers) collectively pay roughly $231 million a year to fund MPAC. Apparently we’ve now privatized land data the same way we privatized Highway 407: the public pays for the infrastructure, but accessing it still feels like a premium subscription service. If you were trying to design a system that advantages established property owners while limiting new entrants, you’d be hard-pressed to do better.

So as long as access to land, both physically and informationally, remains constrained, the supply outcomes won’t change. At some point, we need to stop asking how to optimize a broken system and start asking how to open it up. Because you cannot solve a housing shortage at scale with scarcity logic.

And you certainly can’t solve it blind.

For legal reasons everything I post is a joke.
