Issue 001 · Spring MMXXVI · Greenpoint, Brooklyn

The Growth 001

A field journal from a decommissioned convent in Greenpoint, Brooklyn — where the garden is on the roof and the hardware is in the basement.

§ I — Letter from the Editors

A letter from the second floor.

Dear reader — welcome to The Growth. The Growth is a Local AI magazine published by The Grove, and our way of catching, between covers, everything that happens inside the building: the meetups, the demos, the projects half-built upstairs, the people who come through, the conversations the boilers don't drown out. We work out of a decommissioned convent in Greenpoint, Brooklyn — the garden on the roof, the hardware in the basement, the East River close enough that you can walk to it from Williamsburg in ten minutes. Researchers, builders, artists, and scientists share the building; The Grove is what happens when we share the infrastructure too.

And the deepest piece of infrastructure we share is the local network. When you are building connectivity between pieces of software, why go all the way out to a remote server when the person you want to talk to is right next to you? People have been putting LAN- and peer-to-peer seams into software for decades — Bonjour, Magic Wormhole, half a dozen now-dead chat apps that ran over the local subnet — and most of the attempts miss because most users aren't on the same campus as each other. We are. Sharing a building is what makes the proposition tractable. In each project covered here you'll find some thread that runs over a peer-to-peer network instead of a server: transcription that merges across every laptop in the room, a generative stage that responds to whoever is closest to the mic, a search that queries the open internet and the local subnet, and more. The cloud is a place we visit on purpose; the local net is a place we live.

This is the inaugural issue. The pieces begin with an argument we keep coming back to — that the most interesting software, right now, is the kind that runs on the desk in front of you — and end with the program we are about to open applications for. In between sits a record of one season: a meetup we host on the second floor, a pipeline we are building in the east studio, a transcription tool we wrote when no good local one existed, and the compute co-op down the hall.

None of it is for sale. We hope you keep the journal. If you want to come by, the next meetup is on Saturday — full details, and an RSVP, are pinned at the end of § III.

— the editors, on a Tuesday after dinner.

§ II — Argument

In defense of software that stays home.

Local-first is not a feature. It is a way of being neighborly with your own tools.

A piece of software that lives on your computer is a piece of software you can break, fix, and outlive. It does not ask permission to start. It does not phone home to confirm you are still allowed to open the file you wrote yesterday afternoon. The garden is here, on this desk; the cloud is a place we visit on purpose.

The argument for local AI extends this register by one syllable. A model that runs on your laptop is yours in the way your file is yours: the audio you feed it never leaves the room, the response arrives in forty milliseconds rather than fifteen hundred, and no policy update on a faraway server can quietly make it forget how to do the thing you depended on it for last week. Cost falls out of the same fact — you have already bought the silicon, and it does not bill you per token.

There is a quieter argument too. A model that has lived on your machine for a year has heard your half of every conversation you handed it; it has learned which words you use that other people don't. That is either an intimacy or a leak depending on where the model lives. We prefer the version where the intimacy stays yours.

And there is the question of trust. Software written to run at internet scale — between strangers, across hostile networks — has to defend itself against Sybil attacks, against equivocation, against the slow suffocating grind of distributed consensus. Local-first software has all the same problems in principle. It also has a much simpler answer when the people on the other end are sitting at the same table: you can recognize each other. Trust that comes from being in the same room is older, cheaper, and harder to forge than trust that comes from a signature scheme.

The Grove builds local-first because it is the only register in which we can plausibly keep our tools available to us in ten years, and because it makes the bill arrive at one address.

§ III — Field Note · Local AI Meetup

Local AI Meetup, by lamplight.

A monthly gathering on the second floor — Brooklyn's quiet local-AI meetup, where Greenpoint meets Williamsburg. Demos run on consumer hardware. Bring a laptop, an extension cord, an open question.

I. Mid-event. Ceiling projections in cream and pink; the screen on the far wall cycling through somebody's slides.
II. A demo in the side study, half an hour after dinner. The model was small enough to fit on the table.
III. The couch nearest the lamp, somewhere past the second hour. The conversation had wandered far from where it started.

The meetup runs once a month, always on a weekday, always after dinner. People walk in from Greenpoint, Williamsburg, and the rest of north Brooklyn — sometimes from across the East River. The overhead fluorescents hum at a frequency that is hostile to thinking, so we kill them and light the room with desk lamps and a torchière a neighbor donated when she moved upstate. People bring whatever hardware they own — a Mac mini, a borrowed 4090, a tower someone built on the roof during the February storm. About half the demos do not work. The half that do are usually small, surprising, and not what their authors expected.

Benchmarks are not the point. The point is what happens when a model is small enough to live on a desk in Brooklyn, and what people choose to do with one once it belongs to them. If you are looking for the local AI meetup in NYC, you have found it.

§ IV — Project Profile

Etherea: create as you speak.

A live performance system that runs in the convent's main hall — same building as the meetup, different floor. The audience passes the mic; an LLM-orchestrated pipeline turns their words into a moving visual world. No two shows alike.

IV. An image generated from the sentence “the room remembered itself.”
V. The stage, mid-set. The mic is the prompt; the projector is the canvas; the audience is the lighting director.

Etherea is the live storytelling system that runs out of the same convent we do, on the floor with the high ceilings — a permanent generative stage where, in their own line, frontier art meets frontier models. An audience passes a microphone around a darkened room; the words they speak are orchestrated through a stack of language and image models into a moving visual world projected behind the speakers. The tagline is the working theory — create as you speak — and the room, when it works, behaves like a story everyone is writing at the same time.

No two shows are the same — because no two audiences imagine the same thing.

— from withetherea.com

It is part performance, part participatory art. The system also tours: over nearly two years, thousands of people in different cities have passed the mic in front of it and watched their half-sentences become weather.

The lag between voice and image is the standing design question. Faster, and the system reads as a slot machine. Slower, as a séance. The seam between the two is where the work has been good — and the room tightens or loosens it depending on whether the next person in the circle has ever held a microphone before.

§ V — Tool Profile

VoxTerm: a transcript that stays in the room.

A terminal app that listens to a meeting, names the voices, and shares notes between the laptops in the same room — over LAN, never the internet.

VI. The interface, mid-meetup. Speakers learn their colors after a couple of sessions; party-mode peers see the same colors on their own laptops.

VoxTerm is the transcription tool we wrote when we kept ending the meetup wishing we had one. A microphone catches the room, an open-weights model on the laptop turns it into text, and a small daemon recognizes the voices it has heard before. Nothing leaves the device: the audio is never written to disk; only the transcript is saved, as Markdown, to a folder you choose.
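The "recognizes the voices it has heard before" step can be sketched in a few lines. This is a minimal illustration, not VoxTerm's actual daemon: the embedding model, the threshold, and the `speaker-N` naming are all assumptions made for the sketch.

```python
# Hypothetical sketch of speaker recognition by embedding similarity.
# A real system would get `embedding` from a voice-embedding model;
# here it is just a list of floats.
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two voice embeddings."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify(embedding: list[float], known: dict[str, list[float]],
             threshold: float = 0.8) -> str:
    """Return the best-matching known speaker, or enroll a new one."""
    best_name, best_score = None, threshold
    for name, ref in known.items():
        score = cosine(embedding, ref)
        if score > best_score:
            best_name, best_score = name, score
    if best_name is None:
        # An unfamiliar voice: give it a label and remember it.
        best_name = f"speaker-{len(known) + 1}"
        known[best_name] = embedding
    return best_name
```

The design choice worth noting is the enrollment fallback: a voice the daemon has never heard becomes a new speaker rather than an error, which is how a transcript can name people it met five minutes ago.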

We made it a TUI on purpose. The terminal is the most platform-agnostic surface still in good standing — runs on any machine that runs anything, needs no renderer, and saves us from futzing with whichever heavy UI framework is fashionable this season. The form is also having a moment: lazygit, yazi, and a quiet wave of apps built on ratatui and textual have turned text-based interfaces — which date to the early 1970s — into one of the better threads on Hacker News this year.

There is a party mode. Press N on two or more laptops on the same network and they discover each other over mDNS, share a color over an encrypted channel, and merge their transcripts in real time. Each laptop captures its closest speaker best, so a four-laptop party produces a transcript no single microphone could have. No relay servers; if you can ping it on the LAN, it can hear you.
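The merge itself can be sketched under stated assumptions: each laptop emits timestamped lines with a confidence score, clocks are roughly in sync, and near-simultaneous lines from the same speaker on different laptops are the same utterance heard at different distances. None of these names come from VoxTerm itself; this is an illustration of the "each laptop captures its closest speaker best" idea.

```python
# Hypothetical party-mode merge: keep the clearest copy of each utterance.
from typing import NamedTuple

class Line(NamedTuple):
    t: float           # seconds since session start
    speaker: str
    text: str
    confidence: float  # the local model's confidence in its own hearing

def merge(feeds: list[list[Line]], window: float = 1.0) -> list[Line]:
    """Merge per-laptop transcripts into one, one line per utterance."""
    # Pool every laptop's lines and walk them in time order.
    pool = sorted((ln for feed in feeds for ln in feed), key=lambda ln: ln.t)
    merged: list[Line] = []
    for ln in pool:
        # Same speaker within the window: treat as the same utterance,
        # and keep whichever laptop heard it best.
        if merged and merged[-1].speaker == ln.speaker and ln.t - merged[-1].t < window:
            if ln.confidence > merged[-1].confidence:
                merged[-1] = ln
        else:
            merged.append(ln)
    return merged
```

For example, if one laptop hears a garbled "hello evryone" at low confidence and a closer one hears "hello everyone" clearly a tenth of a second later, the merged transcript keeps only the clear copy.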

VoxTerm is open source and lives at dmarzzz/voxterm. One curl command if you have an Apple Silicon machine and Python.

§ VI — Notes from down the hall

The Greenpoint Compute Co-op.

A single workstation, training a model for the cohort that funded it.

In the room next to the boiler-room stairs, an NVIDIA GB300 hums quietly through most of the day. It is the entire compute footprint of the Greenpoint Compute Co-op, a small group fine-tuning open-weights models on the activity of their own cohort — calendars, decisions, the things people end up agreeing to over a Tuesday dinner. The result is supposed to help that cohort coordinate. The premise, in their words: “a model that helps a specific group coordinate is the smallest honest instance of that work.”

Contributors fund the hardware and operations, get early access to the model, and are listed by name on the workstation itself. Donations are accepted in ETH or by bank transfer. The full pitch lives at wrld.nyc.

§ VII — What's coming next

Shape Rotator, soon.

An accelerator for people building things that run on the desk in front of them.

After this issue closes we are turning toward the Shape Rotator program — an accelerator for projects that share the same disposition as the work in this issue: small, local-first, opinionated, willing to be wrong on purpose. Cohorts will be a season long. Applications open at shaperotator.xyz; the first details land there before we publish issue 002.

[ed. — if you are reading this and the page is empty, you are early. that is the right time to write to us.]

Endnotes & references

  1. The Grove. grov3.net. Greenpoint, Brooklyn.
  2. ETHEREA. withetherea.com. The Convent, Greenpoint.
  3. VoxTerm. github.com/dmarzzz/voxterm.
  4. Greenpoint Compute Co-op. wrld.nyc.
  5. Shape Rotator. shaperotator.xyz.
  6. “Why TUIs are back,” Hacker News.
  7. lazygit, jesseduffield/lazygit; yazi, sxyazi/yazi; ratatui, ratatui.rs; textual, textual.textualize.io.
  8. Text-based user interface, Wikipedia.