Every project has a moment where it clicks. For Tokyo Pulse, it was when I dragged the time slider through a full 24-hour cycle and watched the flow particles surge during rush hour, then slow to a crawl at midnight — suddenly the visualization wasn’t showing data anymore. It was showing a city breathing.

This post is about how I got there.

The Architecture Decision

The first real decision was: how do you put interactive data art on top of a real map?

I tried three approaches:

Pure Canvas — Fast, but you lose all the built-in interactivity. No hover states, no click events without manual hit detection. Felt like building a game engine just to show 23 dots.

Pure SVG with D3 — Beautiful transitions and events, but no real map underneath. The wards floated in abstract space. It looked like a network diagram, not a city.

Leaflet + D3 SVG overlay — This is what worked. Leaflet handles the map tiles, zoom, and pan. D3 handles the force simulation and all the data-driven elements. Canvas handles the high-frequency animations (flow particles, cursor glow) that would choke SVG.

Three rendering layers, each doing what it’s best at.

The Force Simulation

D3’s force simulation is the heart of the piece. Each ward is a node with its real latitude and longitude. The simulation applies four forces simultaneously:

A geographic pull gently tugs each node toward its real position on the map. A link force connects adjacent wards with invisible springs. A charge force pushes all nodes apart to prevent overlap. And a collision force enforces minimum spacing based on node size.
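
In d3, the geographic pull could be `d3.forceX`/`d3.forceY` with per-node targets, or a custom force; a custom force is just a function of `alpha` that nudges node velocities each tick. Here is a minimal plain-JavaScript sketch of that idea — the names (`forceGeo`, `gx`, `gy`) and constants are illustrative assumptions, not the project's actual code:

```javascript
// A d3-style custom force: on each tick, nudge every node's velocity
// toward its projected geographic position (gx, gy).
function forceGeo(nodes, strength = 0.05) {
  return function force(alpha) {
    for (const node of nodes) {
      node.vx += (node.gx - node.x) * strength * alpha;
      node.vy += (node.gy - node.y) * strength * alpha;
    }
  };
}

// Tiny manual tick loop to show the effect (d3's simulation would
// normally drive this, alongside the link, charge, and collision forces):
const nodes = [{ x: 0, y: 0, vx: 0, vy: 0, gx: 100, gy: 50 }];
const pull = forceGeo(nodes, 0.1);
for (let i = 0; i < 300; i++) {
  pull(0.3);                 // apply the force at a fixed alpha
  nodes[0].x += nodes[0].vx; // integrate velocity into position
  nodes[0].y += nodes[0].vy;
  nodes[0].vx *= 0.6;        // damping, like d3's velocityDecay
  nodes[0].vy *= 0.6;
}
```

Because the pull is proportional to distance but damped, nodes settle near their geographic targets without snapping to them — which is exactly what leaves room for the other forces to push back.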

The result is that nodes float near their geographic positions but aren’t locked to them. They have room to breathe, to respond to dragging, to rearrange when you switch data modes.

The simulation never fully stops. Every few seconds, micro-perturbations keep the nodes gently drifting — like a city that’s never completely still.
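
A sketch of what a micro-perturbation can look like — the function name, magnitude, and interval here are assumptions, not the project's actual values:

```javascript
// Give every node a tiny random kick in a random direction so the
// layout never settles into a frozen state.
function perturb(nodes, magnitude = 0.4) {
  for (const node of nodes) {
    const angle = Math.random() * 2 * Math.PI;
    node.vx += Math.cos(angle) * magnitude;
    node.vy += Math.sin(angle) * magnitude;
  }
}

// With d3, you would also keep the simulation warm so the kicks stay
// visible instead of decaying to zero:
//   simulation.alphaTarget(0.02);
//   setInterval(() => perturb(simulation.nodes()), 3000);
```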

Making It Feel Alive

On their own, data points just sit there. Several animation layers work together to make the piece feel alive:

Ambient pulse — Glow circles behind each node slowly breathe, fading between bright and dim on a sinusoidal curve.

Flow particles — About 150 tiny dots travel along the links between wards, leaving trailing afterimages. They travel in both directions, at varying speeds, creating a circulation effect.

Time cycle — The slider doesn’t just change a number. At rush hour (7–9AM, 5–7PM), flow particles triple their speed and brighten. At night, the map darkens and node glows intensify against the darkness. The city’s daily rhythm becomes visible — from the stillness of 3AM to the chaos of the morning commute to the evening exhale.

Cursor glow — A warm radial gradient follows your mouse, rendered on a separate canvas with screen blend mode. It’s subtle — you might not consciously notice it — but it makes the entire surface feel responsive to your presence.

Film grain — A full-screen SVG noise filter at 4% opacity adds texture to everything. Without it, the visualization looks like a software product. With it, it feels like a photograph.
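
Two of these layers reduce to small pure functions. Here is a sketch of the sinusoidal breathing curve and the rush-hour speed multiplier; the period and opacity range are assumptions, while the tripling and the rush-hour windows come from the description above:

```javascript
// Ambient pulse: glow opacity breathes between dim and bright on a
// sinusoidal curve. tMs is elapsed time in milliseconds.
function glowOpacity(tMs, periodMs = 4000, min = 0.2, max = 0.8) {
  const phase = (tMs % periodMs) / periodMs;            // 0..1
  const wave = (Math.sin(phase * 2 * Math.PI) + 1) / 2; // 0..1
  return min + wave * (max - min);
}

// Time cycle: flow particles triple their speed during rush hour
// (7–9 AM and 5–7 PM).
function flowSpeedMultiplier(hour) {
  const rush = (hour >= 7 && hour < 9) || (hour >= 17 && hour < 19);
  return rush ? 3 : 1;
}
```

Each animation frame reads the current slider hour, multiplies every particle's base speed by `flowSpeedMultiplier(hour)`, and sets each glow's opacity from `glowOpacity(now)` — so the whole daily rhythm falls out of two tiny functions.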

The Radial View

Halfway through development, I added a toggle that strips away geography and arranges the wards in a circle sorted by density. It was originally a debugging tool — I wanted to see if the force simulation could handle a completely different layout.

But the radial view revealed something the map couldn’t: the dramatic gap between Toshima (23,906 people per km²) and Chiyoda (5,909). On a map, those wards are both “central Tokyo.” In the radial view, they’re at opposite ends of the circle.

Same data. Different truth. I kept the toggle.
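
The radial arrangement itself is simple geometry: sort the wards by density, space them evenly around a circle, and hand the resulting coordinates to the simulation as new target positions. A sketch — function and field names are illustrative, and the density figures are the two quoted above:

```javascript
// Sort wards by density (densest first) and compute an evenly spaced
// target position for each on a circle of the given radius.
function radialTargets(wards, cx = 0, cy = 0, radius = 200) {
  const sorted = [...wards].sort((a, b) => b.density - a.density);
  return sorted.map((ward, i) => {
    const angle = (i / sorted.length) * 2 * Math.PI - Math.PI / 2; // start at top
    return {
      name: ward.name,
      x: cx + radius * Math.cos(angle),
      y: cy + radius * Math.sin(angle),
    };
  });
}

const targets = radialTargets([
  { name: "Chiyoda", density: 5909 },  // people per km², from the post
  { name: "Toshima", density: 23906 },
]);
// With just these two, Toshima lands at the top of the circle and
// Chiyoda directly opposite — the gap the map view hides.
```

In the live piece, these targets replace the geographic pull's targets when you flip the toggle, and the existing force simulation animates the transition for free.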

What I’d Do Differently

If I built this again, I’d start with the time slider. The day/night cycle is the feature that makes people pause and explore. It should have been the first thing I designed around, not something added later.

I’d also spend more time on the ward boundaries. Right now the nodes float above a CartoDB Dark Matter tile layer. Ideally, each ward would have its actual boundary polygon rendered as a subtle shape that highlights on hover — connecting the abstract nodes to the concrete geography beneath them.

And I’d add sound. The data is rhythmic enough to drive ambient audio — density mapped to pitch, flow speed to tempo. That’s a future project.

Data Source

All population, area, and density figures come from the Tokyo Metropolitan Government’s Bureau of General Affairs, Statistics Division (東京都総務局統計部), based on the October 2025 Basic Resident Register (住民基本台帳). Ward adjacencies follow real administrative boundaries.

Try It

Experience Tokyo Pulse →

Press K for kiosk mode (gallery display). Drag the time slider. Click a ward. Switch to radial view. Drag a node and watch the physics respond.

The artist statement has technical specs and an embed code if you’d like to feature it.


Simba Hu is a data and AI strategist based in Tokyo who builds data infrastructure by day and data art by night. Find him at simbahu.com.